User:NeilK/Worklog/2009-12-01 to 2010-11-30




Dec 30 2009
============

Problem with uploader at last minute:

seems to be a path issue with an include?

could not parse response_text:
Warning: require_once(PEAR.php): failed to open stream: No such file or directory in
/home/neilk/Documents/wmf/extensions/OggHandler/PEAR/File_Ogg/File/Ogg.php on line 114

Fatal error: require_once(): Failed opening required 'PEAR.php'
(include_path='/home/neilk/Documents/wmf/extensions/OggHandler/PEAR/File_Ogg:/home/neilk/Documents/wmf/js2-work/phase3:/home/neilk/Documents/wmf/js2-work/phase3/includes:/home/neilk/Documents/wmf/js2-work/phase3/languages:.:/usr/share/php:/usr/share/pear')
in /home/neilk/Documents/wmf/extensions/OggHandler/PEAR/File_Ogg/File/Ogg.php on line 114




Jan 21 2010
===========

finally understanding that all the interesting stuff happens in Firefogg.js, which monkey-patch-inherits from mw.BaseUploadInterface.
So it was calling progressBar, but not in the new way that I wanted.

WTFs
however, curiously, Firefogg.js is called even if Firefogg is NOT present or active. Instead it tests if it is activated and then
passes through for every call?!?
***REDACTED***


what is 
upload_modes...

detectUploadMode uses a callback so it can determine stuff async, then callback will modify main object
- autodetect / detect_in_progress
- copyupload
- api (presumably this is chunks?)
- post (presumably this is standard file upload?)
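Roughly what I think that pattern amounts to (my own sketch, not the real code -- checkForUploadApi is made up):

// sketch of the detectUploadMode idea: figure out the mode asynchronously,
// then have the callback poke it back into the main object before uploading.
function detectUploadMode( uploader, callback ) {
    uploader.upload_mode = 'detect_in_progress';
    // pretend check: does the target wiki expose an upload API at all?
    checkForUploadApi( uploader.api_url, function ( hasApi ) {
        uploader.upload_mode = hasApi ? 'api' : 'post';
        callback( uploader.upload_mode );
    } );
}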


Jan 24 2010
===========
ok stepping through how Firefogg.js gets initialized

This whole live dependency crap is really getting me down. Does it have to be this complicated?
And does it have to be live? It seems we could do this with static analysis when we want to publish. However, this is a little bit easier to test...

it appears that FirefoggGUI.js is not used, although still referred to in code.

FIREFOGG's inheritance, WTF
==============================
Firefogg.js initializes a subclass of mw.BaseUploadInterface, preserving overridden parent elements in "pe_*" properties.
It is called with an options array
  api_url  -- which ensures we don't use POST, because we have API
  enable_chunks -- chunked uploads, how does this differ from having an API? surely they are the same? or not?
  form_rewrite -- true
  form_selector -- the id of the upload file form field, in our case wpUploadFile

BUT THEN
it also initializes its own api url
and preserves BaseUploadInterface's api_url!
This is confusing, inherit it or don't, WTF

similar for 
- api_url
- enable_chunks
- selector  <-- the id of the upload form input field

Hm, not sure why we keep passing around selectors instead of the actual DOM element
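For my own sanity, the "pe_" trick as I understand it (sketch only, names approximate):

// copy the parent's members onto the child; anything we're about to override
// gets stashed under a "pe_" name ("preserved element"?) so the override can
// still call through to the parent version.
function monkeyPatchInherit( child, parent ) {
    for ( var name in parent ) {
        if ( typeof child[ name ] !== 'undefined' ) {
            child[ 'pe_' + name ] = parent[ name ];
        } else {
            child[ name ] = parent[ name ];
        }
    }
}
// so an override can do:  this.pe_displayProgressOverlay();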


ok, so, then, on Firefogg doRewrite, we replace the page form with our own
we can be in another mode called "local" where I guess there is no form to rewrite
   then we would call createControls()
   and bindControls()

instead, we detect that our selector is in fact an input field by looking it up and getting tagName, so
we do setupForm()

which even defers to baseUploadInterface, at least to start...
but firefogg also may create a form element as a child of the next available form element. That seems like an awful hack or something.
I really dislike all this if/then/else behaviour, should be refactored into different interface personalities. With clear separation of
control and logic
these are fucking confusing, have to ask mdale exactly how many different places we expect to use AMW

ok, so BaseUploadInterface's setupForm() barely does anything -- it just figures out where the form is, 
preserves its old onSubmit
and then adds our own which will serialize JS information into the form somehow once we do upload

n.b. we have this.form_selector, which chooses the form. Not sure where that got configured...
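My paraphrase of the onSubmit dance (a sketch, not the real source -- names approximate):

// setupForm() keeps whatever onsubmit the form already had, then installs
// our own handler, which runs first and then defers to the old one.
mw.BaseUploadInterface.prototype.setupForm = function () {
    var _this = this;
    var form = $( this.form_selector ).get( 0 );
    _this.orig_onsubmit = form.onsubmit;          // preserve the old handler
    form.onsubmit = function () {
        if ( _this.orig_onsubmit && !_this.orig_onsubmit() ) {
            return false;                         // the old handler vetoed it
        }
        return _this.onSubmit();                  // our own submit logic
    };
};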

meanwhile... back in Firefogg
yikes, we create a Firefogg 'en passant' while testing for its existence and its suitability to our configured minimum version
getFirefogg is side-effecty -- in getFirefogg() we set this.fogg

aaand, ok, while we are setting up the actual form in createControls, this is how it works
we examine default_firefogg_options --> each option that has 'target' in it spawns a procedure to create html, gets appended to a string. WEIRD.
aaand, then this html string gets appended to the form

- adds a select new file field, to add more, I guess
  -- note: when you select a file, then the button to select the original file is removed, and "select new file" appears
- save ogg button I guess to save files locally
- some messages -- checking for firefogg.. firefogg is installed, install firefogg (broken link), More about Firefogg (wikipedia), Please install Firefox 3.5 (or later), 
Your selected file is already Ogg or not a video file

...and then all of this is immediately hidden with a chained jquery hide(). OK THAT'S JUST WEIRD

n.b. now all the selectors are renamed. So instead of #wpUploadFile it's #wpUploadFile_firefogg-control

and then, onward to bindControls()
first thing: it does the same sort of loop for "target" controls, and then hides them all over again.
taking the time to build up a comma-separated list called 'hide_target_list', for jQuery, instead of using each or whatever

except this time it hides every fucking thing including the existing file upload

then we re-show the file-upload control
(and a bit later, bind it to selectSourceFile)

then we re-show the associated button and do more or less the same thing.

then we also bind the "save local file" button


ok, popping back up to the anonymous "Setup firefogg jquery binding"
we had created $.fn.firefogg as this function, which I guess is some way to make jQuery plugins
we add a ref to the firefogg object into the form selector element itself, as .firefogg. I wonder why.
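If I squint, the plugin part is just the usual jQuery pattern, something like this (sketch; the constructor name here is a guess):

// $.fn.firefogg wraps the selected element, builds a Firefogg uploader
// around it, and stashes a ref to that object on the element itself.
$.fn.firefogg = function ( options ) {
    return this.each( function () {
        var fogg = new FirefoggUploader( $.extend( { selector: this }, options ) );
        this.firefogg = fogg;   // the mysterious ref added onto the element
    } );
};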

bouncing back into mwUploadHelper.init, we also add some sort of "destination check" callback


----

Ok wait WTF?

onSubmit... then what happens?
remapFormToApi -- changes upload form so that when submitted, sends a request to MW API
- why are we doing this?
- "upload form hacks on commons" -- what are they?





This is the call stack for upload status

doUploadStatus()
doChunkUpl...FormData()
doChunkUpload()
doUpload()
(?)()
detectUploadMode()
onSubmit()
(?)()
handle()
(?)()



And here it is for getProgressTitle (the first time)

firefogg.js 
  getProgressTitle()

baseUploadInterface.js 
  displayProgressOverlay()

firefogg.js 
  displayProgressOverlay()

baseUploadInterface.js 
  onSubmit()

baseUploadInterface.setupForm() sets up this callback in the form  
  (?)()

jQuery binding to the form
  handle()
  (?)()




Wait, is getProgressTitle EVER called? This is confusing


stepping through this

when submit is called
first we warp the form, adding various things
line 419 Firefogg.js first
displayProgressOverlay();
  defers to BaseUploadInterface displayProgressOverlay();
  in BUI
    - removes old one now (really should do this at cleanup... or actually in future, we won't at all)
    - creates an #upProgressDialog div
    - makes it a dialog via jquery
    - shows dialog
    - progress-bar-ifies one of the divs

the real work happens in doUpload ... so far have not done any ajax or uploading at all


ok, so, in doUpload()
we first serialize the form data to a JSON array
then we munge it a bit with comments/description, for some commons hack

formdata is only used in a 'post' style upload... not the firefoggy chunked upload

on to doChunkedUpload() -- so far still no communication 

the logic here is totally bogus
it checks if we have editToken already, if so, goes on to doChunkedUploadWithFormData()
otherwise, it calls an async api call to get an edit token, with callback and then go on to doChunkedUploadWithFormData()

this means that doChunkedUploadWithFormData may be called sync OR async from this location, depending on 
state of editToken. We may return right away or much later. That's a bit nuts
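If I were going to fix that, I'd make it uniformly async, something like (sketch; getEditTokenFromApi is made up):

// always defer, so doChunkedUploadWithFormData() is called on a consistent
// (async) schedule whether or not we already have the token cached.
function withEditToken( uploader, callback ) {
    if ( uploader.editToken ) {
        setTimeout( function () {
            callback( uploader.editToken );
        }, 0 );
    } else {
        uploader.getEditTokenFromApi( function ( token ) {
            uploader.editToken = token;
            callback( token );
        } );
    }
}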

so, then, if transcoding, we munge the filename to .ogg... I wonder if Firefogg is actually using this or if we are just guessing
what the filename WILL be. I mean, what about all those dups in my home directory?

onward to doChunkUploadWithFormData

ok, we set up the aReq, which are the args we'll pass to fogg to do its thing
	action = upload
	comment = whatever
	enablechunks = true
	filename = the filename with the OGG format
	format = json
	token = whatever

then we get encoderSettings ALL OVER AGAIN even though we just did that
then we call fogg to do its thing, async, with no callback
then we kick off a process to monitor
so, on to firefogg.js -> doUploadStatus()


our first response text is this
{"upload":{"result":"Warning","warnings":{"exists":"Setyourselfonfire.ogv"},"sessionkey":286876311}}

the updateProgress defaults to BaseUploadInterface


BUG -- "Select new file" will not update the filename. (not editable?)
although after you try it, then will prompt for a new file dialog if you press "Upload"!

BUG - cancelling = blank page

BUG - overwriting files with same name leads to unknown error afterwards
  - what is the right behaviour? Probably check hashes first, then offer to re-upload


=========================================================================================================================
REGULAR USABILITY MEETING -- 

now meeting regularly 2x week

================================================
They are not really interested in Firefogg uploads
Just want to get a prototype for Usability study ASAP

================================================

March 1st:

Target browsers
- Firefox 3
- IE 8
- Safari (n/a)
- Opera (n/a)
Obviously more is better

TODO: Prototype non-JS uploader ASAP

========================================
Look at staged uploads as implemented by "Bryan"

 -- already emailed MDale about this
 -- TODO? reach out to Bryan

========================================
Test plan for new uploads

========================================
Upload requirements




Jan 27 2010
OMGOMGOMG iPad day!!!1!
==========================

bleh, spent too much of the day just catching up with ideas and concepts and emails and people and so on

leaving the old upload page modification for now, maybe not bother until Mdale gets back, tomorrow? next day?

Ok so how are we going to create Guillaume's mockup in time for usability study


Jan 28 2010
===========
delving into creating a "special page", for now in my somewhat abused js2-work dir (although checked out js2-work again in "upload2",
will set that up as a website soon.)

looking at how the upload page is constructed
- lots of options for how uploads are enabled, resulting in different html
  for example, permitted file types
  or whether upload-by-url is allowed

However, in the world of upload2, we will want to create new uploads at will on the page
with characteristics like allowed extensions, etc.

so what is the right thing to do when creating this form? 
  1) create api methods to determine form configuration
  2) have php that outputs "config" for the javascript on the page to pick up 
  3) both!


-- Regular multimedia meeting
+ Michael -- what are the pieces that Neil can reuse
  - reuse
  - simpleupload.js -- hacks in existing form to editing widget, so it can then be added to AMW
+ No-Js
  -- accessibility - blind -- beat it with different UI
  -- corporate environments without JS
    -- default to old form
    

What's going to happen with the usability study
-- present the user an upload task
    - image only
  - with old Upload Form
  - with new prototype
-- must work on prototype live server


Jan 28 2010
===========
Discussion with Trevor on where & how to develop
- Decision for now: 
  - Work inside the Usability Extension in trunk
  - Basically all my stuff will be confined to a Special:Upload2 page 
- Development issues, plans
  - Inter-wiki upload
    - How is this solved? check with guillaume/naoko
      1) Wikimedia projects, file Upload: the Upload file page just links to a Commons server upload. Done.
      2) AMW in Wikimedia projects: ?? We still have to offer them some sort of choice due to fair-use images being stored on
          Wikipedia itself, and common-use, licensed images stored on Commons.
          Technical issues: it is hard to upload to another site for crossdomain reasons 
            - need to open an iframe proxy
            - the actual file form element has to be INSIDE this iframe
                 (this is easier with upload-by-url, since we just pass an ajax call to the iframe)
    - Basically what this means is that a form served on some MW instance (almost certainly a Wikimedia proj for 1st iter) will upload 
      to commons and not to anywhere else 
    - MDale claims this is not an issue for the regular upload page, only for uploads done with AMW

  - Inter-wiki login
    - Can be assessed with an ajax call passed through the api proxy iframe

  - Upload styles
      No javascript
      1) simple form upload with file input, no script -- maybe will leave existing page as is, unaltered,
          or, later, we can update it to be more like the JS version
      2) Upload by URL

      JS enabled
      1) iframe/api - for browsers that can't do XHR correctly, submit a form with action = the api but target = an iframe. Then we read
            data out of the iframe
      2) XHR upload - for browsers that can do XHR
      3) Firefogg client-side transcoding / chunked - for browsers that have Firefogg
      4) Upload by URL


"Stashed" upload - when exactly does this occur?
  -- an unnamed file asset which is stored somewhere temporarily (session) and some metadata
     -- but no file record was created for it yet

QUESTIONS:
What does it mean:
   "checks filename extension .... changes it if not good"
    i18n of js?


Handling multiple files
   - cases:
     - where multiple files are selected by the user... split their selection into multiple uploads/progress bars?
     - where multiple files are dragged into the user interface area

What API changes if any are needed
   - figure out unique filename BEFORE uploading?
   - and unique content hash, maybe
     - need ability to override uniqueness in content hash
   

What code or conceptual changes
   - coordination pages for OTRS licensing
     - any database changes needed?
      - jobs to prompt users later?
   - page to collect all of user's contributions with incomplete metadata
 
What conceptual changes or database changes
   - tokens or other db storage for OTRS coordination
   - "incomplete" uploads
     - which are tied as a sort of "TODO" for the user
     - how to implement?
        - add a template on their user page
        - BUT cronjob will need to be scheduled <-- could be deferred till after usability 

How new permission flow works
1) User uploads file needing OTRS permissions process
  - Mediawiki adds template saying this is a problematic file needing OTRS.
  - added to category : needs review
2) Mediawiki emails rightsholder with special super secret URL 
   - n.b. could be a hash of secret + image hash + (?? rightsholder email) for now
3) rightsholder clicks ok
4) Mediawiki receives click
5) Mediawiki emails OTRS
6) OTRS volunteer removes template
   - category removed


Development schedule:
   - emphasis on getting something for usability test ASAP

   - question: interwiki upload -- necessary on Upload page at all?

    new upload page
       - new form, with code loaded from UsabilityExtension
       - add a wg-setting to activate new uploader   (2/2)
       - make it look like and flow like mockups in very modern browsers
         for a simple upload (IE 8, Firefox 3.5+, Safari, Chrome)
                                                     (2/9)
            - auto-rename / replace
            - check file hash
--> upload 
   --> initially we accept filename
   --> optional step 
      -- adding description
      description influences "title" and "filename" in metadata form
       "title" influences filename
       filename influences title
are there apis for uniqueness?


            - extract metadata and make it available in the api....
       - licensing tutorial (mocked) 
       - coordination pages for licensing confirmation (2/16)
       - add appropriate templates when licensing is wrong
            template:Nld 
                                                      (2/23)

       
              

    March
       - Firefogg support
       - Retest support for older browsers and other workarounds
           - iframe/api (old IE, Ff)
           - Firefogg chunked (Ff + fogg)
           we have this today, we just reverify that it's working
 
       - taking over interface for AMW ... interwiki uploading?

     


Uploader January February

Feb week 1 Page   jpeg
Feb week 2 User flow own work
Feb week 3 i found it on the web, permission 
Feb week 4 

Feb week 2.5 staging prototyping

Staff meeting week 1 March


Incomplete uploads start March



Fri Feb 5
=========

Sigh. Still tearing up the universe stuffing everything into UsabilityExtension...

Well, I seem to have succeeded in just ruining everything...

Firefogg uploads do not work... but wait, I had to enable the perms so maybe that works now
no, it's just firefogg... i'll have to fix that

Actually, does BUI even initialize itself?


QUESTION: what about re-uploads?? New interface? Same?



Wed Feb 10
===========

well, blew past the tuesday deadline to get a prototype working

the new way

use the gadget to "deploy" on Commons or wherever we want.

the gadget will suck in the libs via mwEmbed from prototype, which is running js2-work

the gadget itself picks which script to run -- we'll just have it pick a different .js to run when "uploadWizard=1" in the params.

Special:Upload + newUpload parameter 

+ load mwEmbed + whatever else from prototype
- gadget triggers on special:upload + newUpload parameter

JS2

loaded normally

To add gadget on commons must be staff / admin. 

-- TODO to make this work -- let's go all the way
1) add own triggers to js/mwEmbed/remotes/mediaWiki.js  &uploader=true
2) install appropriate gadget on js2-work.keatley.local
-- which can pull code from ITSELF? self-configured? I guess...
-- augh
-- we'll work this out in some deploy.


Howto: define a gadget.
=============================================

1) First, enable the gadgets extension on the wiki in question.
in LocalSettings.php, somewhere near the bottom
require_once( "$IP/extensions/Gadgets/Gadgets.php" );

2) define the following page with the code to run
http://wiki.keatley.local/index.php/MediaWiki:Gadget-mwEmbed.js

3) enable it by adding it to Gadget definitions
- create or edit
http://wiki.keatley.local/index.php?title=MediaWiki:Gadgets-definition&action=edit
Add a line beginning with a star corresponding to the gadget
Format: * name|file|file|file
such as:

* mwEmbed|mwEmbed.js

4) Now it should appear on the Special:Gadgets page.
You can edit the description by clicking the edit link -- it will take you to edit a 
page like: 
http://wiki.keatley.local/index.php?title=MediaWiki:Gadget-mwEmbed&action=edit

5) And now... you should be able to add it to your prefs
And so it is

YAY

Another option for local testing is to add the equivalent of the JS to one's skin + custom js. (Under preferences/appearance)
then it just loads on every page, just like the gadget. The gadget is effectively this but reusable for anyone.

when one wants to share URLs with gadget one can just use a special parameter like with-gadget, I'm not sure where 
that comes in. Might be a general framework for enabling a view as if you had a certain preference?


Thu Feb 11
==========

mdale did a sort of refactoring which separated UI from Upload
In his world, there's an UploadFile, which is a concrete manifestation of something that uploads
   - we have the BaseUploadInterface
   - and Firefogg
And then there are UploadInterfaces, which are properties of the Uploads, e.g. upload.interface

this is wrong because, mdale is still thinking sort of firefogg centrically
uploadinterfaces are only instantiated at the moment upload begins
that's not really what I intended
let's just push forward -- this is probably at least somewhat helpful.

We are doing various tricks to make this load right in the gadget and still allow for debugging

first, debug=1 in the gadget invocation of mediawiki

second, defer loading most libraries until the actual module


Fri Feb 12
==========

Naoko to review progress...
- some distance on creating the HTML within the bloody 
Will make a dummy uploader that always works


HTML5 declarations -- according to Trevor, doctype=HTML? 
Nimish mentions that IE needs a few lines of JS so it doesn't lose its shit with regards to tags it doesn't recognize

George here at 11:00


lovely meeting w/ george

made a lot of progress on uploader form...


Tue Feb 16
==========
Sigh, why do I not actually work on weekends?

ok let's see what can be done.

made a lot of progress -- literally, we now have global progress bars, multiple files.

Wed Feb 17
==========
todos -- 
  file completed should set progress to 100%
  some sort of global completed(), which removes the bar and says HURRAH or plays a sound file.
  remove ability to Browse... once uploading started.

should we allow deleting uploads once uploading has begun?
would have to recalc weights if that happened... so we should move that init code to the updateFileCount(), or call it from both places?

the progress bar should be in some kind of polling loop
is the progress bar even vaguely accurate
why does it jump UP at the end instead of going to zero

TODO!! A REAL FUCKING UPLOADHANDLER FOR THE LOVE OF GOD
Then... xhr
Survey the API and what happens with 'stashed' uploads or whatever

move the introductory text a bit higher where it belongs


TODO - multiple file uploads splitting up?
TODO - drag & drop API
TODO - file API to determine size


TODO -- make this its own component? how get common libs?


Thu Feb 18
==========

Add real uploadhandler
try to make the second page.

-- hurr, so here's a question
Two ways to do this:
- poll the uploadhandler for its progress, completion.
	- this implies that we don't "store" progress, completion, and weight in the uploadhandler itself.
	- also, we need to keep state about each upload in the uploadwizard -- we have to notice the moment it goes from
		not complete to complete, and then stop other polling processes perhaps
	- how do we handle problems? poll for those too?
- give the uploadhandler callbacks for its progress, completion, problems
	- certainly simpler


Another question -- add things into the uploadHandler object, or create a dict that contains the uploadhandler object
so we can add random data like the weighting
- would remove ugliness like grepping for identical objects, passing 'self' back
- the various queues would then look like dicts... problem? probably not
- actually we could replace them with the single upload dict, with statuses
	

The UH does need some sort of callback to flag problems.
So perhaps it's the callback 
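If we go the callback route, the contract might look something like this (just a sketch of the shape, nothing real yet):

// the wizard hands in functions; the handler reports through them and never
// stores wizard-level state (weights, global progress) itself.
function UploadHandler( file, callbacks ) {
    this.file = file;
    this.onProgress = callbacks.progress;   // called with a fraction 0..1
    this.onComplete = callbacks.complete;   // called once, with the API result
    this.onError    = callbacks.error;      // called with a code + info object
}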




Fri Feb 19
==========
almost refactored api + iframes, yay

getting iframe result errors

also, we get an iframe result "on load" before we really want to

ok we are getting a real result now but it keeps complaining that enablechunks and file are in the same request... and they're not
presumably the filter here is broken

yay unfucked it all -- updated wiki to trunk

and now, the upload succeeds!
we just have to get the apiprocessor to understand that it does. 


Mon Feb 22
==========

Lost a lot of time this morning with big long preso of strategy

OK so goal today is to get API processor to work in an abstract way
I don't suppose wikieditor has this already?

isApiSuccess() ??

just do some generic way to detect success and failure and errors can be filled in later
ugh, do we REALLY have to solve this today?
why are there apparently so many different returns from the API?

And then a start on page 2 would be nice

ok we are sort of working now

the callback method of updating UI and wizard is really annoying
a pure polling method might be a lot better, although scary 

figured a way around the twice-loading iframe problem -- onload (first is blank) it now self-configures to be ready for the next load (API call)
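The workaround, roughly (sketch; the iframe id and helper are made up):

// the first load event is just the blank document, so use it to arm the
// iframe; only on the next load (the real API response) do we read results.
var armed = false;
$( '#api-proxy-iframe' ).load( function () {
    if ( !armed ) {
        armed = true;                      // first load: blank page, ignore
        return;
    }
    readApiResultFromIframe( this );       // second load: the API call result
} );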

Tuesday Feb 23
==============

(Way of "stashing" uploads....?)

Discussion with Roan about how to extend API

How to override a core API module

class ApiMyCustomUpload extends ApiUpload {
    public function execute() {
        // Do your stuff here
        parent::execute();
    }
}

// and then
$wgAPIModules['upload'] = 'ApiMyCustomUpload';

// other options are to add hooks
// About the hook thing, you can just add wfRunHooks( 'hookname', array( param1, param2 ) ); in any old place
// <flipzagging> ok, so I could theoretically make the 'beforeUploadedFileStored' hook
// <flipzagging> add it to the Upload API in trunk
// <flipzagging> then make my extension to hook into that hook
// <RoanKattouw> Yes

What does the metadata for an image look like?
it's stored in the img_metadata column in image table, 
which could be semi-redundant with the description

How do wikis extract EXIF??
automatically, each media type has a getMetadata method, which then 
makes a serialized PHP structure in img_metadata

It can be obtained from the API thusly

http://commons.wikimedia.org/w/api.php?action=query&titles=File:T-45A_Goshawk_03.jpg&prop=imageinfo&iiprop=metadata

could be used to prime input; particularly DateTime, DateTimeDigitized or DateTimeOriginal if available

n.b. DID YOU KNOW??! titles can be a pipe-separated list to get all at once!
probably not so useful for us as we'll get metadata async
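Sketch of how we might prime a date field from that metadata array (the name/value structure is the one shown in the success response further down):

// walk the imageinfo metadata and return the first usable EXIF date.
function findExifDate( metadata ) {
    var wanted = [ 'DateTimeOriginal', 'DateTimeDigitized', 'DateTime' ];
    for ( var i = 0; i < wanted.length; i++ ) {
        for ( var j = 0; j < metadata.length; j++ ) {
            if ( metadata[ j ].name === wanted[ i ] ) {
                return metadata[ j ].value;   // e.g. "2009:08:31 13:45:39"
            }
        }
    }
    return null;
}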


#errorstuff

Conventions around Upload protocol
- a file which is replacing another sets 'ignorewarnings' to true
- nonfatal problems (warnings)
   - badfilename -- was the resultant filename different from desired? If so return what we actually got in the 'badfilename' warnings
   - filetype-unwanted-type -- bad filetype, as determined from extension. content is the bad extension
   - large-file -- $wgUploadSizeWarning, numeric value of largest size (bytes, kb, what?)
   - emptyfile -- set to "true" if file was empty
   - exists -- set to true if file by that name  already existed
   - duplicate -- hash collision found
   - duplicate-archive -- hash collision found in archives 
   XXX does a warning always result in a stashed session?

- fatal problems (errors)  -- actually these are not necessarily all different from warnings (for instance emptyfile)
   - EMPTY_FILE empty-file
   - FILETYPE_MISSING filetype-missing  (missing an extension)
   - FILETYPE_BADTYPE filetype-banned (extension banned)
             returns 0, { filetype => the filetype we thought it was, allowed => [ extensions ] }
   - MIN_LENGTH_PARTNAME filename-tooshort
   - ILLEGAL_FILENAME illegal-filename
             returns 0, { filename => verification[filtered] }
   - OVERWRITE_EXISTING_FILE -- overwrite
   - VERIFICATION_ERROR verification-error
	     returns 0, {details => verification-details}
   - HOOK_ABORTED: 'hookaborted', 0, {error => verificationerror }
   - default
        unknown_error
             0, { code => verificationstatus }
 

general form of a response JSON
only one error, since they always immediately die
possibly many warnings
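So the client probably wants one place that sorts a response into success / warning / error, roughly like this (sketch against the shapes catalogued below):

function classifyUploadResponse( result ) {
    if ( result.error ) {
        // fatal: only ever one error, since the API dies immediately
        return { status: 'error', code: result.error.code, info: result.error };
    }
    if ( result.upload && result.upload.result === 'Success' ) {
        return { status: 'success', imageinfo: result.upload.imageinfo };
    }
    if ( result.upload && result.upload.result === 'Warning' ) {
        // non-fatal: possibly many warnings, plus a sessionkey for the stash
        return { status: 'warning',
                 warnings: result.upload.warnings,
                 sessionkey: result.upload.sessionkey };
    }
    return { status: 'unknown', raw: result };
}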

actually, it's hard to upload an empty file... this just spins; possibly the browser doesn't believe it


// this is apparently possible?
{
	"upload": {
		"result": "Failure"
	}
}


Here's a fatal error with bad filetype
{ 
	"error": { 
		"code": "filetype-banned", 
		"info": "This type of file is banned", 
		"filetype": "wmv", 
		"allowed": [ "png", "gif", "jpg", "jpeg", "ogg", "ogv", "oga" ] 
	} 
} 



Here's an actual warning with stashed upload
It seems to test for name first
{
	"upload": {
		"result": "Warning",
		"warnings": {
			"exists": "Exotic_flowers.jpg"
		},
		"sessionkey": 1123304744
	}
}


{ 
	"upload": { 
		"result": "Warning", 
		"warnings": { 
			"exists": "Test_name_collision.jpg" 
		}, 
		"sessionkey": 1657701996 
	} 
} 

A warning with hash collision (duplicate)
{ 
	"upload": { 
		"result": "Warning", 
		"warnings": { 
			"duplicate": [ "Transamerica.jpg" ] 
		}, 
		"sessionkey": 1773022077 } 
} 

Here's an actual success
{
	"upload": {
		"result": "Success",
		"filename": "ComputerHotline_-_Lepidoptera_sp._(by)_(30).jpg",
		"imageinfo": {
			"timestamp": "2010-02-24T00:37:29Z",
			"user": "WikiSysop",
			"size": 6601931,
			"width": 4288,
			"height": 2437,
			"url": "http:\/\/wiki.keatley.local\/images\/d\/d2\/ComputerHotline_-_Lepidoptera_sp._%28by%29_%2830%29.jpg",
			"descriptionurl": "http:\/\/wiki.keatley.local\/index.php\/File:ComputerHotline_-_Lepidoptera_sp._(by)_(30).jpg",
			"comment": "",
			"sha1": "b0d78a626b09e21e0bed78647e94b14de94fbf89",
			"metadata": [
				{
					"name": "ImageDescription",
					"value": "Atlas"
				},
				{
					"name": "Make",
					"value": "NIKON CORPORATION"
				},
				{
					"name": "Model",
					"value": "NIKON D300"
				},
				{
					"name": "Orientation",
					"value": 1
				},
				{
					"name": "XResolution",
					"value": "300\/1"
				},
				{
					"name": "YResolution",
					"value": "300\/1"
				},
				{
					"name": "ResolutionUnit",
					"value": 2
				},
				{
					"name": "Software",
					"value": "Ver.1.10 "
				},
				{
					"name": "DateTime",
					"value": "2009:08:31 13:45:39"
				},
				{
					"name": "Artist",
					"value": "Bresson Thomas                      "
				},
				{
					"name": "YCbCrPositioning",
					"value": 2
				},
				{
					"name": "Copyright",
					"value": "Bresson Thomas                                        "
				},
				{
					"name": "ExposureTime",
					"value": "10\/4000"
				},
				{
					"name": "FNumber",
					"value": "63\/10"
				},
				{
					"name": "ExposureProgram",
					"value": 3
				},
				{
					"name": "ISOSpeedRatings",
					"value": 320
				},
				{
					"name": "ExifVersion",
					"value": "0221"
				},
				{
					"name": "DateTimeOriginal",
					"value": "2009:08:31 13:45:39"
				},
				{
					"name": "DateTimeDigitized",
					"value": "2009:08:31 13:45:39"
				},
				{
					"name": "CompressedBitsPerPixel",
					"value": "4\/1"
				},
				{
					"name": "ExposureBiasValue",
					"value": "0\/6"
				},
				{
					"name": "MaxApertureValue",
					"value": "33\/10"
				},
				{
					"name": "MeteringMode",
					"value": 5
				},
				{
					"name": "LightSource",
					"value": 9
				},
				{
					"name": "Flash",
					"value": 0
				},
				{
					"name": "FocalLength",
					"value": "1050\/10"
				},
				{
					"name": "SubSecTime",
					"value": "06"
				},
				{
					"name": "SubSecTimeOriginal",
					"value": "06"
				},
				{
					"name": "SubSecTimeDigitized",
					"value": "06"
				},
				{
					"name": "ColorSpace",
					"value": 1
				},
				{
					"name": "SensingMethod",
					"value": 2
				},
				{
					"name": "CustomRendered",
					"value": 0
				},
				{
					"name": "ExposureMode",
					"value": 0
				},
				{
					"name": "WhiteBalance",
					"value": 1
				},
				{
					"name": "DigitalZoomRatio",
					"value": "1\/1"
				},
				{
					"name": "FocalLengthIn35mmFilm",
					"value": 157
				},
				{
					"name": "SceneCaptureType",
					"value": 0
				},
				{
					"name": "Contrast",
					"value": 0
				},
				{
					"name": "Saturation",
					"value": 0
				},
				{
					"name": "Sharpness",
					"value": 0
				},
				{
					"name": "SubjectDistanceRange",
					"value": 0
				},
				{
					"name": "GPSLatitudeRef",
					"value": "N"
				},
				{
					"name": "GPSLongitudeRef",
					"value": "E"
				},
				{
					"name": "GPSAltitude",
					"value": "232\/1"
				},
				{
					"name": "GPSSatellites",
					"value": "0"
				},
				{
					"name": "GPSMapDatum",
					"value": "WGS-84"
				},
				{
					"name": "GPSDateStamp",
					"value": "2009:08:31"
				},
				{
					"name": "MEDIAWIKI_EXIF_VERSION",
					"value": 1
				}
			],
			"mime": "image\/jpeg",
			"bitdepth": 8
		}
	}
}





- possible warnings: 'duplicate' - key matched
		     'exists'    - same name


March 3-5
=========
Massive tech meeting planning meeting crap

A lot of thinking.

Although, came to the conclusion that there should be at least two pages -- should not be an all-in-one JS app.

Uploader is a JS app
but then, you should be able to return to your "incomplete" uploads at any time to work on them -- so that's a special page on its own 

The concept of step1,2,3 is probably wrong -- it's upload, then fix, and if not we nag you (bot?)

Special:IncompleteUploads

Should other people be able to work on the queue of Incompletes? Or just me?

This also implies that every wiki has to be able to set policy, like:
- mustHaveLicense 
- minimumDescriptionLength (in characters)
- requiredDescriptionLanguages ('en', 'fr'...)
- minimumCategories <-- this is implemented by a bot already though

Realized that adding categories is already done (meh-ish) on existing commons page. In fact almost everything is done there already, we just
need to jquery-ize it all.

Also, we really ought to do batching better than (or at least as well as) Flickr does. We also want to allow 3rd party batch edits, like
somebody searches for all the pictures of the Eiffel Tower and geocodes them, or adds descriptions, or adds a category, or even removes
a category


March 6-7
=========
Didn't spend the entire weekend coding like I should have


March 8 2010
============
made UploadWizard module within MwEmbed, committed

Should probably nail down what the specs for the test are, asap

Today!! Parse success and do... stuff with it

populate Information

populate Author field for instance

populating Location would be kind of awesome

Then!! batch author / provenance

...Did thumbnailing async, with an iinfo. I should really learn the API backwards and forwards

Talked to Guillaume. He agrees with my idea of a queue of images that need some attention from the user (and/or OTRS) before proceeding
- this implies that perhaps there's a generic "MediaCollection" extension
   - which is subclassed, or obtains PublicationCriteria from yet another extension, like "MediaCollection::Commons"
      e.g. image must have license, minimum description, etc. be as non-annoying as possible, so just make it custom PHP code.


March 9 2010
=============

meeting w/Naoko today
also longer meeting with testing people this aft

- committed loading spinners

- now let's do source, license?


Meeting with gotomedia

- question do we need to provide multimedia files?
   they will ask the user to bring some, but otherwise will provide a few just in case.

Declared feature freeze: March 15. OMFG


March 10 2010
=============

What to do:
- implement renaming files....

Ok so here's the deal
- it would be easier to move the file target if we implement preview in php
  - BUT, then we have to: 
     - ensure that stashed images are servable
     - figure out thumbnailing in PHP, and make it work (somehow) for stashed images
     - implement this as an extension 
     - what's good:
       - this gets us on the right track for the more correct implementation in April
     - what's bad:
       - hard to understand... how do we do this in time... do we even know if temporarily stashed files can be served? how could they be
         served?
         ---> special URL, with special access permissions
         ---> look up the stashed file
         ---> read the serialized PHP
         ---> looks up its location
          ---> modified perhaps by a thumbnail
         ---> cats it out to STDOUT
  - then we implement all uploads as "preview" uploads
     - and we have to be able to obtain info about them with some sort of API method
     - we still would want to put criteria in PHP-land for publishing them
     - not just control it in JS-land --- but we could have our "queue" page
     - we need to store "stashed" stuff somewhere, will have to be PHP for now?  
	
- or:
  - implement Moving the file, in the usual way, controlled by AJAX
   - we can't do this on commons since it forbids renaming files.
   - we'd have to do it on prototype, somehow




--- interesting: chrome has a different look for file upload. Should really get on taking control of CSS here
tada! x-platform styleable file input committed.

-- next: descriptions
--- correct languages
  --  TimedText?
  -- commons?
--- add and subtract the 
--- test how it makes the Infobox?

line 340 of jquery.autocomplete.js -- var data is null? 

sigh


March 11 2010
=============

Not looking good, is it

Okay:
description language
just create a fucking select menu for now

Done, and with a unicode-ibetical sort by name, too
imported some of the logic from UploadForm.js so we don't completely lose things as we go.

Got responses from @Nikerabbit asking why I'm making things more complex, also explaining a few things
(r63628, r63627). He's done good work on localizations


March 12 2010
=============

-- description, and let's do a simple update
descriptions are working, to getWikiText
took a break, fixed up coding style now so we don't have to do it later

March 13 2010 (saturday)
=============
add simple updater now


-- next: filename; and, changing the filename


-- next: licenses
--- OMFG how am I going to do this
--- maybe fake it for now




-- Bug: in Firefox, the remove x is not working on uploads since it's obscured somehow by the file item behind it
-- Bug: in Firefox, the invisible and visible files are not quite aligned correctly

March 14-15 2010
================
up since 8:00pm last night
thought we were demoing to GotoMedia today, but it seems we aren't, just the usual thing

- got date functions working, populating working on general layout and macro stuff
- got importing of templates working -- turns out we need to have extensions like ParserFunctions for them to work
- activating various extensions now that look reasonable to match Commons, (see Special:Version)
- RoanKattouw has a list of license templates for us too
- also got an account on toolserver.



TODO: talk to RobH about DNS for prototype server

March 17 2010
=============
Working the metadata form

goals:
- Licenses
	- how does the license texts thing affect us? Drop down menu?
	- use the same menu that Commons is using now?	
- Submit everything (case "3")

- File renaming
   --> first, get the right stuff in each field, and auto fix upon each other's changes.
   --> then, on change of either, do a check to see if taken
     --> if taken, show the form is in error somehow.
   --> call that function at load as well to see if we need to fix it
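The 'is it taken' check could be as simple as asking the API about the target title, something like this (sketch; assumes same-wiki, and the usual wgScriptPath global):

// a page that exists comes back without the "missing" flag on its entry.
function checkTitleTaken( filename, callback ) {
    $.getJSON( wgScriptPath + '/api.php', {
        action: 'query',
        format: 'json',
        titles: 'File:' + filename,
        prop: 'imageinfo'
    }, function ( data ) {
        var pages = data.query.pages;
        for ( var id in pages ) {
            callback( typeof pages[ id ].missing === 'undefined' );  // true = taken
            return;
        }
    } );
}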

ok fucked around a little too much with NTH features like location and so on. Get with the fucking program
licenses are not avail at the moment, need to study commons again
so let's deal with file renaming.
have to figure out the api method for renaming, and then also make that possible locally

stretch goal:
- Categories?
- Case 3, I found it on a website.

- Location tag (hidden, perhaps)
	- seems to be impossible to prefill, mediawiki is eating the relevant items?
		- or, are my test images just fuxored, don't have right exif tags?
		- also, what about XMP?
	- could theoretically add this without prefilled info. Ok, will try...


March 18
========
working on file uniqueness, porting some of michael's work to a more abstract framework
looks actually not bad

- review our status

todo for 1:30pm

- review features to add, estimate time remaining, for 13h30

next todo
- install a JS2 wiki on commonsprototype. Should be able to use same db as the wiki prototype.
- talk with michael today about enabling all compression features, closure, preload

- continue to develop, with a "flow-first" pattern. Error checking later, experience now.


--> Source decision: 6 hours 
   --> own work 1 hour 
   --> selecting licenses 2 - 4 hrs

--> CSS/themeroller:  1-2 days

--> Staging/deployment: 3 hrs

book flights w/egencia
done

worked on staging/deploy as is highest risk

- Michael explained what I need to do to make uploadWizardPage work 
  - flip the config to useScriptLoader to on
  - script loader debug to off
  - could simply splat UploadWizard initialization in an mw.load(function() { ... } ) in the loaded page
    - first generation is slow, next is fast.

- have a js2 environment available on commonprototype:
http://commonsprototype.tesla.usability.wikimedia.org/js2/index.php/Special:Upload
-- still some config wonkiness
  -- complaining about ProxyTools
       Warning: file(/srv/org/wikimedia/prototype/wikis/js2-work/mwblocker.log) [function.file]: failed to open stream: No such file or directory in /srv/org/wikimedia/prototype/wikis/js2-work/includes/ProxyTools.php on line 206
       Warning: array_map() [function.array-map]: Argument #2 should be an array in /srv/org/wikimedia/prototype/wikis/js2-work/includes/ProxyTools.php on line 206
  - Does not seem to have correct path configuration -- it thinks the link should be js2/Main_Page (no worky), but js2/index.php/Main_Page is the one that works
    Similarly you need index.php/Special:Upload
  - Haven't tested full uploads
  - Haven't configured it to load all the modules we use in keatley LocalSettings.php

TODO
- get JS2 into a working state so we can test uploads
- sync commonsprototype js2 to that 
- double check how our image uploads work
- try to upload something
  - test that page urls, thumbnails, etc, work


Okay, so file renaming
-- added the fancy destination checker. Todo
-- why is the thumbnail missing now
-- have to make the "about the file" thing follow exactly -- we are trailing by one change

-- filename:
-- the initial filename is wrong


March 22
========
Got in super early, added various small features to phase 2 (Details)

Presented to goto media guy -- still very rough!

Banged head against the wall with JS2 trying to figure out why serving times for some things were still in excess of 12 seconds. 
Remaking a URID every time? Mdale tried to help but not much so far. May have to live with this :(

Got to the multiple details submission and realized the code that handled the progress bar would have to be copied all
over again, with the same kind of callbacks

Decided that I needed to make things a bit more abstract -- made it all event based. Reduced LOC by 100 (modifying over 500) and it is simpler
to boot.

Left 11pm


March 23
========
multi-submit
had checkin with Naoko & Guillaume, things looking not so awesome, maybe need to check in again on Friday to pull plug on early April, get next slot late April


March 24
=========
Deeds
getting a bit fancy with my jquery here


March 25
===========


March 26
==========
dead end -- upload more?
most likely action is to add it to a wikipedia page
default caption
no alignment

Website case 
please enter the addresses - files on same page

breaking license default exception

if 1 no exceptions to the rule

fix css it seems to be broken

features:
- configure prototype so URLs work
- configure prototype with licenses, other templates such as Information
- form validation / help 
- thumbnail default caption, no alignment
- for "found it on a website" put files on same page
- for all? put files on the same page but add a "break with license above"
- header not bold
- checkbox not at left of text block
- fineprint not fine
- pointer cursor
- more option fewers options arrow open/close

page 1
- the file input is ugly; make it more obviously clickable
- OK and Uploading are not in the CSS correctly; OK has a checkmark next to it.
- bar moved right sized
- bar uploads Uploads Editing details
- color size 
- single click upload to file chooser
- alternating tint blocks for files (or lines) - visually separate on step 2
- same on step 3

- shrinking arrow bar
- sliding pages

case "thirdparty"
- "They are not my own work"
- keep " we need information" on top
- link to open the options enter only once
- descrollbar textareas
- "Source"
- examples for source
- popping open help as you focus on fields + don't show
- entering author from EXIF


flow fixes:
- layout
- default thumbnail size 120px (jaggies??)
- no exceptions for 1
- no link in Details thumbnails
- display pics under agreement
- one button submit

Nice to have
- categories??
- geo??


March 28 (sunday)
==================
working from wmf
committed some changes left uncommitted on friday
now wtf, wasted too much of the day
configuring commons.prototype.w.org -- figured out the configuration issues preventing upload / rename

March 29 
==========
starting with CSS as maybe it will make people feel better; plus, Guillaume marked these as hi-priority
what is with this jaggy bug with backgrounds in ff 3.6.2
fixed, scaling window
meeting to determine if dry-run this week... no
Guillaume wants to do dry-run very close to actual study, which makes sense
will do friends & family release later in the week instead
rest of day -- worked on the arrow-headers


march 30
===========
let's stay on step 1 for now
work on file button

we start out with:

--> add a file button
--> upload 0 covers that.

after selection:
--> we insert a div for upload 0
--> move upload 0 fileinput to cover that div
--> upload 1 fileinput covers add-a-file
--> add a file becomes "add another file"
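For the record, the styleable-file-input trick itself, in jQuery terms (sketch; class names invented):

// float the real <input type="file"> over the pretty control, oversized and
// transparent, so the click lands on the native input but the user only sees
// our own button/div underneath.
$( '.file-input-wrapper' ).css( { position: 'relative', overflow: 'hidden' } );
$( '.file-input-wrapper input' ).css( {
    position: 'absolute',
    top: 0,
    right: 0,
    fontSize: '100px',   // oversize so the Browse... widget covers the whole area
    opacity: 0,          // jQuery maps this to filter:alpha() for old IE
    cursor: 'pointer'
} );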

okay this actually isn't sucking now.

now, we've changed this from the mocks -- we begin with a "nothing" interface. We need to cue them to upload.
should use the green cross button to add
let's look into theme-roller right now
also, make sure to communicate hover state

todo 
-- hoverstate not working for visible files
or rather, it is working but not doing what I expect.
   - css is broken
   - it's not adding to the files
   - something else interferes with css expression

-- theme-roller?

-- add tint blocks, gradients?

-- add file icons

-- plus icons

-- color progress bar


March 31
========
some progress with layout
spent like 4 hours on a stupid bug with mwEmbed that turned out to be an apache issue


April 1
=======
report card
ok let's get page 1 looking like it's fucking supposed to

page 1 not so bad

deed page

-- omg how the hell to do the collapsing stuff. seems almost like accordion boxes
-- author input not copying correctly.
-- why +1 uploads?

April 2
======
spent some time looking at jquery plugins... not easy to find stuff but did find some useful things
and, now it's fucking 2pm
naoko complains broken in ff 3.0.2 -- tried it, does seem like it might be broken, will need to install firebug to be sure 

sigh
hard time getting started
let's go

let's double check what happens with deeds on commons

here are the options

ownwork


OK IS THIS EVEN POSSIBLE

old file
wiki           file       JS2?              result
=====================================================
prototype      old        no JS2            works
prototype      new        no JS2            works

local          new        false             works
local          new        true              fails

local          old        false             works
local          old        true              fails


CHROME
local
JS2 off, scriptloader off, debug off, old uploadwizard, broken -- file input has not moved
                                                                  + debug=true has similar effect as debug on
                                                                  however, loading with ANY query string seems to fix it?
shift-reloading with everything off, fresh cache, means nothing works
however, just simple reloading works fine? same url? NO

JS2 on,  scriptloader off, debug off, old uploadwizard, works
JS2 off, scriptloader on, debug off, old uploadwizard, works (slow?)
JS2 on,  scriptloader on, debug off, old uploadwizard, works (slow?)
JS2 off, scriptloader off, debug on, old uploadwizard, works 
JS2 on,  scriptloader off, debug on, old uploadwizard, works 
JS2 off, scriptloader on, debug on, old uploadwizard, works
JS2 on,  scriptloader on, debug on, old uploadwizard, works 


ok, so, why do we get a similar effect sometimes with any file at all, new or old? is it a matter of lacking debug=true at the wrong times?


Firefox 3.5
local


deeds --

ownwork
  does not communicate new checks to the images
  does not remove source visibility




working on css
-- ought to add radio buttons rather than checkmarks to deed choices -- mention that before submit

-- the title/filename thing is now making the last one on the page take over the first one on the page
-- would be nice to have a generic add button with plus sign


-- the autogrow for title field is not working, it really bollixes up everything.

-- every other field, the usual treatment

-- style fieldsets.




JS2 off  scriptloader off, debug off, new uploadwizard
JS2 on   scriptloader off, debug off, new uploadwizard


remote -- no JS2 even, and, with cleared cache, always works
no matter which file we use?

April 6
=========

- grey out images until license is chosen

Berlin
- seek volunteers
-- slideshow
-- maps tool
-- user gallery
-- XMP/IPTC metadata
-- sharing

Uploader flow
Guillaume presents ideas about workflows, very very abstract.
-- vetoing abstract
-- uploader-only queue of images that do not meet this MediaWiki's standard for publication
 -- do not have license
  -- do not have info in description
  -- sucky filenames
  -- can delete file (or fix problem press publish)
-- renaming files (hidden)
-- deleting files (hidden)

manifesto?
3 paragraphs on what we're doing

Berlin: 
-- session ??
-- Wednesday evening?
-- present something?
-- present prototype and few areas where we think volunteers can help

 

April 8
=======
commons.prototype wiki page with log of deploys

- quarterly goals

- next 1/4
   -- uploadwizard done-ish
     -- incomplete uploads done-ish
     -- release on REAL commons optin before Wikimania
   -- metrics 
     -- in place by June
   -- tutorial
     -- artists?
     -- Nina Paley
       -- good artists (who also understand Creative Commons)

- propose to UK chapter


April 20
========
BERLIN!

Trying to do some useful work here :(

What is the gotomedia schedule?

JS2 -> extension

April 21
========
- going to have meeting 5pm CEST

- trying to make deeds work again, sigh
- ok here's how it will work

-> deed-selector 
  --> guillaume's greying out thing
  --> but really, should be a sort of treemenu
 
-> license selector 
  -> at the moment this is just a set of radio buttons
  -> later treemenu
    -> that hooks into deed-selector, which is also a treeselector

-> a deed 
   -> has an interface on the page somewhere (defined in subclasses mostly)
   -> when "applied" to a particular upload, will update the upload's form with:
	- source
	- author
	- license


--> there are uploads on the page
--> there is a macro-deed-selector 

--> each upload always a deed; if only the NULL deed
-->  the NULL deed should prohibit a submission

-->  this deed may be the deed for the whole page OR
-->    by breaking the link to the deed selector
-->    instantiate another deed selector
-->     should also be able to fix that


April 27 back in SF

Sigh, this whole mask-safe-hide thing, as well as the upload thing, is truly annoying
if I make it positioned absolutely, and the element then moves relative to some other element, we're in trouble
should make it positioned inside the div, relatively? does that help?

April 28
========

had a little hiccup -- for some reason could no longer move files. Maybe since I changed accts to NeilK.

ok things are not so bad, fixed relative div, and now about to truly widgetize the deed chooser


April 29
=========
Meeting notes
things before the study

-- split steps 2-3

- expanding arrow-wedge tabs for errors
  -- expand on focus
  -- or click '...'
- similar arrow-wedge for examples
- maybe modal dialogs for discursive explanations.

errors and validation
- missing fields
   - description
   - title
- incorrect input
    - too small too large -- any freetext field
    - source author license signature

-- title validation
  -- do not autorename; throw up filename conflict error on page 3
  -- when title is considered bad, show error ask them to change it
  
-- incorrect input validated asap

instant validation can only go so far
need to be ready for submissions that are non-blocked by validation (?)

-- category widget
Nice to have



Splitting steps2-3
important: whatever transitions the uploads out of 'deeds' step needs to put them in 'details' for the submission to work
I'm assuming we are not going to bother submitting at step 2?
well, we don't have to for study, anyway.



May 3!!
=======
continue to work on 4-pager with wizardy buttons
decided not to use jWizard for the moment, but will return to it a bit later.


May 4
======
submitted expense reports
presented 4-step process to Naoko/Guillaume, synced to commons

n.b. doing a prettier button

Media search button with magnifying glass

<a class="ui-state-default ui-corner-all ui-icon_link rsd_search_button" href="#">
  <span class="ui-icon ui-icon-search"></span>
  <span class="btnText">Media search</span>
</a>

Big blue rounded corner bar, with close [x]

<div class="ui-dialog-titlebar ui-widget-header ui-corner-all ui-helper-clearfix unselectable="on" style="-moz-user-select: none;">
  <span id="ui-idalogu-title-rsd_modal_target" class="ui-dialog-title" unselectable="on" style="-moz-user-select: none;">Add media wizard</span>
  <a class="ui-dialog-titlebar-close ui-corner-all" href="#" role="button" unselectable="on" style="-moz-user-select: none;">
    <span class="ui-icon ui-icon-closethick" unselectable="on" style="-moz-user-select: none;">close</span>
  </a>
</div>


May 5
======
For the hell of it, tried it out in chrome... a few hiccups but more or less works
committed a fix for Chrome

Going to try IE

well, that sucked, it seems Michael's latest changes caused nothing to work. 
First thought it was an IE problem, but it was really due to svn upping.

Tried to diagnose for a couple of hours, gave up, reverting.


May 6
=======

Getting tooltip plugins
qTip looks awesome but is 83K uncompressed (!!!)

tipsy looks excellent, check licensing

sigh, why are none of them exactly what I want?
qtip would be nice if it was somehow slimmer

gotomedia presentation
licensing scariness
======================
- help -- do not take them away (won't come back)
- licensing -- do not use scary words
- how do people look for help?
- explicit help [ ? ] + rollover



May 7
=======
late today, sigh, overslept

reviewing these bloody tip libraries still


May 10
======
tooltips...

mental note: resolve all the Filename-prefix-blacklist versus the JS-defined bad filenames

May 11
======
in trunk!

tooltips that can pop open more info (examples, etc.)
we'll call a rendered HTML page for the wiki for more info, like so

http://en.wikipedia.org/w/index.php?title=San_Francisco,_California&action=render
and dump that into a modal dialog
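i.e. something like (sketch; wgScript is the usual global path to index.php):

function showMoreInfo( pageTitle ) {
    // fetch the rendered HTML of the wiki page and drop it into a modal dialog
    $.get( wgScript, { title: pageTitle, action: 'render' }, function ( html ) {
        $( '<div></div>' )
            .html( html )
            .dialog( { modal: true, width: 600, title: pageTitle } );
    } );
}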

XXX whoops, tab order is obeyed even for the hidden elements. ;)
will have to disable all inputs within a "moreinfo"


May 14
=======
well after all THAT nonsense, just trying to get the stupid progress bar to work the way I want it
fixing some damage commits
also the main thing is to get the tooltips working the way they should

god damn it what is that fucking mysqladmin password


May 17
======

ok now what
I would really like to get my mac laptop online...

should I solve the database update issue instead?

once again JS2 configuration misery

god damn it, JS2 config confusion until  3:00pm
did catch a good bug in JS2, though.

ok, now the thumbnail is never loading
possibly due to the warning we are getting...

no, actually it seems to skip over that just fine
what is unusual is that _this.title in getThumbnail  is "false:NeilK_blah_blah"
  should be image or that constant...
  titles? why titles?!
  
aha the namespace trickery did not make it over when mdale did his thing.
fixed

ok now we can't get off page 3... 


May 18
=======

okily dokily, messages file working, we even have translations already
we asked them to hold off and the translators are 'taking it out of rotation'
one must contact siebrand or put in a support request 


May 22
======
Google international keyboard javascript widget?
virtual keyboard ? on-screen keyboard ?


May 24th
========
Okay going to demo this -- let's do some prettiness
- get templates on the fucking server
get templates locally

- finish validation?

- feedback --
  - Pick another license
    * Increase font size in help bubbles and second line of deed (n)
    * Remove automatic welcome message on sign-up on Commons prototype (n)
    * Add spinner while Step 1 loads (n)

- general prettiness




Let's review how licenses work.

There is a string in the system, 'Licenses'
Which can be overridden in the usual way, by making a wiki page MediaWiki:Licenses
This will contain text like this:
* subst:nld|I don't know what the license is
* Your own work (best practices):
** self|GFDL|cc-by-sa-all|migration=redundant|Own work, copyleft, attribution required (Multi-license GFDL, CC-BY-SA all versions)  
** cc-zero|CC0 waiver, all rights waived (Public domain)
** PD-self|Own work, all rights released (Public domain)
** self|GFDL|cc-by-sa-3.0|migration=redundant|Own work, copyleft, attribution required (GFDL, CC-BY-SA-3.0)
** self|GFDL|cc-by-3.0|migration=redundant|Own work, attribution required (GFDL, CC-BY 3.0)

* Not self-made, but has been released under:
** Creative Commons licenses
*** cc-by-sa-3.0|Attribution ShareAlike 3.0
*** cc-by-sa-2.5|Attribution ShareAlike 2.5
*** cc-by-3.0|Attribution 3.0
*** cc-by-2.5|Attribution 2.5
** Free Art License
*** FAL|Free Art License
** Flickr photos
*** subst:template 2|flickrreview|subst:nld|Image from Flickr and I do not know the license
*** subst:template 2|cc-by-sa-2.0|flickrreview|Uploaded to Flickr under Creative Commons Attribution ShareAlike 2.0
*** subst:template 2|cc-by-2.0|flickrreview|Uploaded to Flickr under Creative Commons Attribution 2.0

* Public domain:
* subst:nld|I found the image on Google or a random website
** PD-old|Author died more than 70 years ago
** PD-Art|Reproduction of a painting that is in the public domain because of its age
** PD-US|First published in the United States before 1923
** PD-USGov|Original work of the US Federal Government
** PD-USGov-NASA|Original work of NASA
** PD-USGov-Military-Navy|Original work of the US Military Navy
** PD-ineligible|Too simple to be copyrighted, not your own work
** subst:Template 2|Trademarked|PD-textlogo|Logo with only simple text (wordmark)

* Other alternatives:
* subst:nld|I found the image on Google or a random website
** Fair use|Fair use image (Not allowed on Commons. Image will be deleted.)
** Copyrighted free use|Copyrighted, but may be used for any purpose, including commercially
** Attribution|May be used for any purpose, including commercially, if the copyright holder is properly attributed

Because this is just a string, in practice, it seems to actually change from language to language. (!!)

Anyway, this will be transformed by SpecialUpload.php, which will call upon includes/Licenses.php 
to transform it into a simple select menu, with outrageous formatting.


More info (and examples and so on) with
http://commons.wikimedia.org/wiki/Commons:Licensing

Might be good for examples...


ok, have a nice crossfade toggler working; just need to figure out height BEFORE we patch into the dom, or somehow create the 
interface after it's in DOM.
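
The sort of thing I mean by measuring before insertion -- a sketch, assuming jQuery; the element has to be attached (invisibly) to have a height at all:

    function measureHeight( $el ) {
        // attach invisibly so the browser lays it out, measure, then pull it back out
        $el.css( { position: 'absolute', visibility: 'hidden', display: 'block' } )
           .appendTo( 'body' );
        var height = $el.outerHeight();
        $el.detach().css( { position: '', visibility: '', display: '' } );
        return height;
    }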

===================================
Must fix week of June 7-11
  ending buttons
  upload / next button
  error messaging on upload issues
  validation page 3
  locking and progress page 3
  errors page 3
  -- spinner while loading -- DONE

niceties
  "to use the file" -- add Wiki note.
  arrowSteps -- the UL is 600px, and the individual LIs are 126px, but somehow this works out to the UL being too wide by two pixels.
    there's got to be some way to make this dynamic, just like the page.
    arrowSteps RTL
  noscript fallback
  stop or cancel uploads in progress?
  discussion link -- DONE

commons.prototype server
   - copy templates 
   - remove account new messages -- DONE 
   - check that we are receiving optimized JS -- NO!! still broken. emailed mdale. 


Sigh ok this is not fucking working on commons
- Nash reports that there's just a blank screen
- can replicate this:
-- ScriptLoader off + No Firebug = blank screen
-- ScriptLoader on + No Firebug = working-ish, but no CSS (various things don't load, we know this already.)


jun 8
=====
problem with major breakage due to a js2 update
mdale fixed it overnight, but now we have the pluralizations not working, which affects the deed forms too.
let's see if turning the scriptloader off again fixes it

yes -- scriptloader definitely fucks the plurals up

mdale seems to have fixed things, ish
communicated it back to nash
ok onwards.
now all the image loading on my computer is broken

sigh all right fixed I guess

ok let's get templates up?

June 9
======
call with gotomedia, answering lots of questions

getting there -- with templates
refactored licenses to use real templates, non-hardcoded strings

added in template strings for Information (hoping for monthnames??)
ok so... here's what is happening, somehow
the language is being rendered as <lang>, why??!?
have to trace this down through ISOdate, sigh


June 10
=======
call with nash
things to do 

images -- neil to obtain or take images

Naoko -- create accounts on wikimedia commons & prototype & sandbox?
      -- send info about AMW to Nash

Neil -- obtain golden gate bridge (or other landmark) images, communicate URLs to nash
-- make blog entries for thirdparty case, communicate URL to nash
-- make licenses distinguishable and helpable -- update prototype

Naoko & Neil
 -- send test script feedback


17th noon -- dry-run
remote
Nash -- send info on how to connect



refactoring the transitioner and the progressbar
almost working -- just need to find who is setting progress to 1.0 immediately


June 11
=======
Snow Day

TODO (Known issues to fix before study)

    * "Use" fields sometimes cut off portions of URLs, wikitext
    * Upload page -- choice between upload / next button should be clear (hide or disable next button)
    * Error messaging on upload issues (suppressed for development, just re-enable)
    * validation details step (including uniqueness of file (suppressed for development))
    * errors page 3
    * progress bar should only appear when we have useful information about progress -- otherwise use a spinner

layout and messaging

    * "to use the file" -- add Wiki note.
    * arrowSteps & wizard lower-right next button -- make both of them flexible size or both fixed size
    * explain licenses better
    * it would be nice if page 2 didn't "flash" while laying itself out.
    * Monthnames in Infoboxes are missing
    * Alignment of upload status column, total uploads on upload page
    * use an orange arrow spinner for "uploading..." on upload page

Commons.prototype server

    * copy licensing templates
    * copy filedesc/infobox templates
    * remove account new messages
    * check that we are receiving optimized JS
    * Verify if apache / php settings allow for long uploads -- mac firefox upload of large files failing? (received by php /tmp just fine, but no response given to browser). Update verified settings, still unsure what the issue is with mac firefox.

Fallbacks

    * noscript fallback (this is designed in, just need to dump old form in there)


==========
June 14 
dry-run


==========
June 15

- meeting, future of Ford contract

Event at Ford foundation?
October November

- Major milestones
- meeting about incomplete uploads
- tutorial

- Rob Lanphier on QA - manage relationship with QA vendors
  - Calcay - Sri Lanka provider - QA on demand
    - develop Browsers / suite of tests
   
June 16
=======
hospital (foot injury)

June 17   
=======
Fleischman / GotoMedia testing

suggested changes

- validation already


- Guillaume was right all along, remove their ability to even change the title -- need a new server-side function to suggest a title?
  - if so, how to detect that we want to replace something? -- maybe that should ONLY work from the file page

- remove ability to edit the "copy this HTML" ? -- capture keyups?  -- readonly="readonly" is a valid property; add background: #ffffff to mwe-long-textarea

- wording change -- page to wiki page

- Use: need to know how to find this again -- my contributions

- upload from the future (verify dates)

- final page -- need some assurance they can find these things again (my contributions)
- disable "next" if filename check is not performed yet

- radio buttons for licenses
- thought -- maybe we should make it more obvious who uploads to wikimedia (users like you, so many today, etc.)
- bar information -- subtract time for uploads started simultaneously
- we really need a progress bar from the beginning, somehow
- functional license chooser -- like http://creativecommons.org/choose/
- lines between images on final page
- for errors, use ui-state-error with triangle icon

worked a lot on rationalizing layout - particularly close boxes and the first page with multiple file inputs.

June 21
======
pretty much finished that layout change, but a bit scared to commit -- should work on the simpler changes first
- still some bugs -- is not covering the file upload links correctly now, is not triggering underline
- should make a nice icon in GIMP for closing

        // return boolean if we are ready to go.
        // side effect: add error text to the page for fields in an incorrect state.
        valid: function() {
                // at least one description -- never mind, we are disallowing removal of first description
                // all the descriptions -- check min & max length

                // the title
                // length restrictions
                // not already taken

                // the license, if any

                // pop open the 'more-options' if the date is bad       
                // the date
                // other information 
                // no validation required
        }


- description label block, clearfix?
- should check if exif date is right (off by one errors??)
- users don't know that date is derived from exif?
- checking for bad characters in filenames
- dammit tipsy's have to move with their divs. Make them position: relative or something...
- isodate no worky
- ought to get a template preview in HTML... then can reuse that HTML for the link to it thingy.


June 22
=======
study

June 23
========
setting up vmware etc.

June 24
======
meeting -- reboot
http://usability.wikimedia.org/wiki/Multimedia:Meetings/2010/06/24

June 25
========
finish unfinished changes to step 3

June 28
=======
what the hell was I thinking about error messages
Okay, here's the problem
We want the error message to push the entire line downwards, like so

[              | Error with this desc    ]
[ Description  | lang      |   desc      ]


We don't want this

[ Description in | Error with this desc   ]
[                | lang    |   desc       ]


We don't have space for this

[ Description in | lang    | desc        ][ <- Error! ]





clearly there should only be a 'required' parameter,


June 30
======
eliminated the problems which were causing the validation rules to not find their form
upgraded to 3.6.6 --- ugh, unfortunately no dpkg, so have to launch manually from the cmd line
anyway, that fixes the stupid errors causing ffox to have a heart attack whenever it can't find parent element
wasted time trying to upgrade to a different apt source, ok, now I don't have ANY firefox that works from the GUI in a normal fashion
also, the installed version is 3.6.5pre. Waste of time if you ask me.


added a human-readable title function

JULY 1
=======
TODO
hey, where's the bloody add descriptions button
description isn't required (?)
prototypal state of 'more options' buttons

ARGH THIS IS SO IRRITATING
anyway after futzing with the date field in a million ways now it doesn't eagerly create its own error msg, nor does it use the one I have.
Strange. Changes:
- now I can't type anything other than digits and dashes into the field, which is good.
- but I also don't get eager validation, or at least it's not apparent
- it is hiding the label, so the association is correct
- does this mean it's allowing 9999-99-9999999999999999999 as a valid ISO date? Must trace this.
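
If it comes to doing it by hand, something this strict would do -- just a sketch, not what the plugin actually does:

    function isValidIsoDate( s ) {
        var m = /^(\d{4})-(\d{2})-(\d{2})$/.exec( s );
        if ( !m ) {
            return false;
        }
        var year = +m[1], month = +m[2], day = +m[3];
        // round-trip through Date to reject things like 2010-02-31
        var d = new Date( year, month - 1, day );
        return d.getFullYear() === year && d.getMonth() === month - 1 && d.getDate() === day;
    }

    // isValidIsoDate( '9999-99-9999999999999999999' )  --> false (fails the pattern)
    // isValidIsoDate( '2010-07-01' )                   --> true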

July 2
=======
ok the problem is, I just changed the name and id
seriously what's up with this name or id stuff
why why why why

also why is the outline different when it goes to red
maybe we don't need to do that...

layout irritations
the errors on top mean tooltips and calendar widgets and so on may lose visual connection with their field
as everything moves down or up
can't see any other way to do this with current layout

errors should flash when you press next

TODO
DONE - fix validation -- make file uniqueness work like rest of validation.
  - actually... let's not
  - for now, we'll just hack it so it appears very similar, but has the wrong class
  - otherwise it will keep blanking the label


DONE - make calendar inline. So that errors simply can't occur -- add more affordances to quickly change the month, year.
- n.b. need to internationalize the calendar
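- for the record, the inline calendar above is basically this (a sketch -- the field ids are made up, and it assumes the jQuery UI datepicker i18n file for the user's language is loaded):

    // wgUserLanguage is the usual MW global; fall back to datepicker defaults if no regional file
    $j.datepicker.setDefaults( $j.datepicker.regional[ wgUserLanguage ] || {} );
    $j( '#mwe-date-calendar' ).datepicker( {
        dateFormat: 'yy-mm-dd',     // ISO 8601 (in datepicker-speak, 'yy' is the 4-digit year)
        altField: '#mwe-date',      // the real form field gets the formatted date
        changeMonth: true,          // quick month / year affordances
        changeYear: true
    } );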


Todo: 
validate languages  -- 
  what does the commons form do again? Doesn't even allow unknown language I think...
    hmmm.. for unknown language commons form simply unwraps text. If you have multiple unknown langs it concats
    I think that's bogus
fucking categories

maybe make that language widget


- restrict inputs on titles (no returns, for instance) -- how is isodate doing that?

- try experiment with tipsyplusplus??
- fix tabbing so you can't tab to a hidden elt like the calendar
- add server-side file name suggester

- error the filename/title when it's obviously bad.

we should make it an error


July 6
======
readded the description adder
investigated ajax categories. it is the suck -- really only good for adding / subtracting one cat at a time
working on categories with autosuggest... more promising

July 7
======
ok let's get the show on the road with autosuggest
n.b. -- like hotcats, should allow the creation of new categories too!
autosuggest doesn't quite do that. Or does it? Options?
may need an autocompleter input with an "add" button. More like ajaxcategories

Hotcats seems to send two different requests (?)
Input text is "He"

REQUEST:
http://commons.wikimedia.org/w/api.php?format=json&action=opensearch&namespace=14&limit=30&search=Category:He
namespace 14 is cats

Response is JSON, thus
["Category:He",
 ["Category:Hemiptera",
  "Category:Heracles",
  "Category:Heinrich Gustav Reichenbach - Xenia Orchidacea",
  ... 
 ]
]

But also REQUEST
http://commons.wikimedia.org/w/api.php?format=json&action=query&list=allpages&apnamespace=14&aplimit=30&apfrom=He

with response:
{
  "query":{
	  "allpages":[
		{"pageid":4295380,"ns":14,"title":"Category:He-111"},
		{"pageid":9967215,"ns":14,"title":"Category:He Guoqiang"},
		{"pageid":3657742,"ns":14,"title":"Category:He Hua Tempel"},
		{"pageid":8234206,"ns":14,"title":"Category:He Kexin"},
		{"pageid":7998628,"ns":14,"title":"Category:He Ping"},
		{"pageid":10258885,"ns":14,"title":"Category:He Wenna"},
		{"pageid":5297232,"ns":14,"title":"Category:He Zhili"},
		{"pageid":8415147,"ns":14,"title":"Category:Heacham"},
		{"pageid":69764,"ns":14,"title":"Category:Head"},
		{"pageid":9360341,"ns":14,"title":"Category:Head-Smashed-In"},
		{"pageid":5942202,"ns":14,"title":"Category:Head-binding domain of phage P22 tailspike protein"},
		{"pageid":5038788,"ns":14,"title":"Category:Head-driven Phrase Structure Grammar"},
		{"pageid":8838368,"ns":14,"title":"Category:Head-on pass"},
		{"pageid":8704147,"ns":14,"title":"Category:Head-related transfer function"},
		{"pageid":6252004,"ns":14,"title":"Category:Head-to-tail joining protein W, gpW"},
		{"pageid":6567807,"ns":14,"title":"Category:Head-up displays"},
		{"pageid":4625784,"ns":14,"title":"Category:HeadWatersTrails Inc."},
		{"pageid":8812938,"ns":14,"title":"Category:Head (hieroglyph)"},
		{"pageid":9395585,"ns":14,"title":"Category:Head Office of Kawasaki Bank (Meij-Mura)"},
		{"pageid":1034203,"ns":14,"title":"Category:Head and Neck Pathology"},
		{"pageid":8092424,"ns":14,"title":"Category:Head and neck pathology"},
		{"pageid":9856891,"ns":14,"title":"Category:Head cameras"},
		{"pageid":8819935,"ns":14,"title":"Category:Head cheese"},
		{"pageid":5942117,"ns":14,"title":"Category:Head decoration protein D (gpD, major capsid protein D)"},
		{"pageid":5941685,"ns":14,"title":"Category:Head domain of nucleotide exchange factor GrpE"},
		{"pageid":4653567,"ns":14,"title":"Category:Head gaskets"},
		{"pageid":5846666,"ns":14,"title":"Category:Head morphogenesis protein gp7"},
		{"pageid":7094964,"ns":14,"title":"Category:Head of Jesus"},
		{"pageid":8429808,"ns":14,"title":"Category:Head of Muir"},
		{"pageid":9722338,"ns":14,"title":"Category:Head of Nike in the Ancient Agora Museum (Athens)"}
	  ]
  },
  "query-continue":{
     "allpages": {"apfrom":"Head of Orpheus"}
  }
}
  
Then, when a cat is selected, further info is fetched:

http://commons.wikimedia.org/w/api.php?action=query&prop=info%7Clinks%7Ccategories&plnamespace=14&pllimit=50&cllimit=10&format=json&titles=Category%3AHell

important -- get the localized name of Category, or it won't work
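
So for our autosuggest, the lookup would be something like this (sketch only -- JSONP because commons is another domain, and the callback wiring is invented):

    function suggestCategories( prefix, callback ) {
        $j.getJSON(
            'http://commons.wikimedia.org/w/api.php?callback=?',   // callback=? makes jQuery use JSONP
            {
                format: 'json',
                action: 'opensearch',
                namespace: 14,                // NS_CATEGORY
                limit: 30,
                search: 'Category:' + prefix
            },
            function( data ) {
                // data[1] is the array of matching titles, e.g. "Category:Hemiptera"
                callback( data[1] || [] );
            }
        );
    }

    // suggestCategories( 'He', function( cats ) { /* feed these to the autosuggest widget */ } );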


 July 8
======
health issues, not so much done



July 9
====== 
creating mwCoolCats based on jquery.tags.js by Byrne Reese

wtf - why does it cause autoselection -- pressing "add" submits??

it replaces the text input, but in our case, we actually have no need to do this. We're collecting wikitext

wtf uniqueness check is broken?

July 12
=======
okay category suggest seems to sort of work

- fix category links
- make the wikitext actually work
- hierarchy?

- examine hotcat a little more closely -- how it obtains cats & submits, use its strategies
- reread the feature requirements lupo posted.

Okay, we left off with prototype apparently not entering category wikitext
like it was stuck on previous version?

Met with alolia (or what is her name now?), briefed her on issues

July 13
=======
People are back! Some, anyway

ok the cats are working now on prototype, sent mail to Guillaume
the problem is caching  -- need to delete JS2's cache folder i.e.
   
   rm extensions/JS2Support/mwEmbed/includes/cache/*  

Supposed to work on portability now... all right

Step 1 -- move to mac for development, permanently
-- edit and svn files natively, use them in a mediawiki install?
-- somehow install ctags / cscope (ports, whatever??)
-- or join the masses, use eclipse...

Step 2 -- get ISOs for working on various OS -- at least:
IE 7
Firefox 3.5+ (Mac, Win, Linux)
IE 6 
Firefox 3 (Mac, Win, Linux)
Safari 4 (Mac)
Chrome 3+ (Mac, Win, Linux)
Opera 9.5+ (Mac, Win, Linux)

Step 3 -- start some sort of bug list

Step 4 selenium? unit tests?



Personal TODO
- git init for transitHeatmap
- figure out some system for transferring vim config

July 14
========
created JelloMold as a separate js extension

July 15
=======
moving over to the mac laptop finally
everything is ALMOST working, web config of mediawiki is erroring out with -2002 error which I think means it can't find mysql.sock
it's defined in php.ini, but we're doing such odd stuff with php from ports and apache from apple that maybe it's reading the wrong ini
trussing the startup of httpd reveals no ini reading... maybe it reads it later?
Priyanka suggests introducing errors into syntax -- good idea, will try doing that to either of the available inis. Maybe NO ini is being read.

July 16
=======
Various other issues setting up mac as main dev machine
phpinfo() tells us what ini files were loaded
was looking in wrong path
also, some issues with PHP4/5 auth -- changed passwords to be old_password(), but it turned out we wanted the new password() after all.
ALMOST FUCKING THERE!! alright fixing some warnings in JS2Support that occur with strictness.
ANNNNND now we just have to make '/wiki/' work as a path... how did we do that before?

before we do anything else we should make ctags or cscope work

1:1 meeting with Alolita
***REDACTED***

July 19
=======
convo with trevor about how to deploy
Most difficult thing will be getting ext into core
 - plan a phased rollout 
   - dark launch
   - opt ins
   - try beta
   - etc.
- Roan gave preso on "why your extension doesn't make it into core"
- figure out who is managing all that

- Wrote a lot of estimates for work for the next few months, sent out to Alolita & Guillaume

Major risk / uncertainty areas:
- JS2Support - Scriptloading. According to Alolita Trevor is thinking of making UploadWizard a test project. Hmmmm
  - consider eliminating
- Stash file system


July 20
=======
Okay, it seems I already have the power to modify Commons' interface. So, hurray for me.
Should find out about sitenotices and that sort of thing

Sitenotices can be defined in MediaWiki:namespace -- MediaWiki:Anonnotice and MediaWiki:Sitenotice
Just enter wikitext
can also be configured with wgSiteNotice

A "take me back" interface on top of page, like Vector rollout, is unnecessary
We just need a "switch back to old interface" button on the page or "opt in to old interface" kinds of links.

- Does this set a preference?
- Sets a cookie?
- Or what?


Discussion with Ryan Kaldari about cycling images on front page
Picture of the day is the best to use because:
  - It's not the same as "Featured" which is just well-annotated stuff
  - Anyone can edit the queue of pictures of the day, so complaints could in principle be resolved


Stashed file system
===================
Requirements:
- we need to store the raw file
- (NEW) and obtain thumbnails
- (??) we need to store metadata associated with that file
- (NEW) we need to be able to add/edit the metadata 
- (NEW) aaaaaannnnd then at a moment of our choosing we integrate into MediaWiki
- (future) will be desirable to put an admin or some checker in there, for new users...

How does it work now?
important files:
api/ApiUpload.php -- stashes file
specials/SpecialUpload.php -- the upload form stashes the session 
upload/UploadBase.php -- stashSession() abstract?
upload/UploadFromStash.php -- another stashSession

In some ways it works similarly to how we imagine -- 
  upload 
  file is stashed -- with metadata
  but then errors can occur
  so upload form is shown again
  
How is thumbnailing done, anyway?


How we do it now -
In ApiUploadHandler we configure a form with a file input, with various things like token and so on
IFrameTransport also configures iframe to transport file to server, makes that the target of the form
adds callbacks for it to report success


Spent a lot of time investigating.

Lars asked about very large uploads...
whether a 100MB upload will crash apache -- nope
and then trying to download or scale up very large images

reading the PHP source led me to 
rfc1867 -- potential way for doing progress bars -- mdale points out it is not easy to make it work with clusters (need to refer to memcached) and 
the client "should" know what it's doing anyway.

Looking into FileAPI and what can be done

Tim Starling comments about PLupload... this is very interesting
uses multiple runtimes (Flash, Silverlight, Gears, etc etc) to do a more usable upload.
It's helpless on IE6 without Flash, but we can assume that most people have Flash anyway.



July 21
=======
Guillaume returns...

Got Magic Mouse and other stuffs
need some sort of keyboard re-lay-outer if I'm going to use vim -- remap esc to caps lock or the menu item, maybe
restarting...
tried about six different key re-layouters, WTF why can no one make this work. Do I have to learn the keylayout format myself?
Some advocate: use a tool to change what caps lock does, make it send something else like 'help', and then remap help to mean esc in yet another tool.
Maybe this is best solved within vim itself, if that's even possible.
on this kb, another possibility is to use the windows menu key located under right thumb.

misery of figuring out why JS2Support doesn't work on IE6 when ResourceLoader (aka ScriptGrouping) is on.
It at least loads all libraries, and reports bugs, but we can't tell where those bugs are.

Installed various debugging tools.


If we have ResourceLoader/ScriptGrouping off, then random failures because some items don't load.
renamed resourceloader config option to ScriptGrouping, since that seems better. Will run that patch by Mdale later.

Tracing through execution (debugging w/ print statements) to pseudo-console suggests that the framework thinks everything is hunky-dory. It tries to load everything, and succeeds.

some help from Michael, suggested that items not loading is due to grouping. Splitting everything into single "groups" in sequence seems to make it predictable, but it's still not *working*

saved the files with resourceloader on and off, in IE6, to disk and using vimdiff to compare.

fixed up my Spaces config for easy switch between vmware and not.

calling it a night (11pm)

July 22
=======
vim tip: interesting: ctrl-[ is the same as mode switch.
also possible to remap quickly typed keycombos, using :imap 
interesting -- alt-up to complete. Ought to get that working properly, along with cscope...


got better vimdiff syntax coloring, which clarified some things.
It occurs to me that we really ought to compare the failing IE6 files to *working* examples of both. ;) Ok! 
using ffox.

hm, with urid's removed, the source is utterly identical
this suggests that injection of the CSS or JS is what is failing, maybe. I don't see where (in the source) this happens
or, possibly the save happens before modification of nodes


July 26
=======
staff meeting
chatted w/ trevor briefly about debugging ie6
long meeting about planning uploadwizard development
still difficulty debugging ie6
  - invokes script debugger; trying to get it to invoke MS VS Express 2005. No luck so far. Must be a registry setting but haven't found it.
  - in any case, may not be any better.
Talk to Trevor tomorrow about maybe debugging better on IE6. See if he has any tricks
Pinged Danese about work situation / green card and post-November situation. She is starting the process, will get to it wednesday? 
  !! she is going on vacation soon

July 27
=======
todo
- talk to trevor

- started to try to use script tags, realized, hey wait a minute, I still have to wait for all scripts to load... the window.onload
method is standard
hm, it seems that IE uses a nonstandard way to tell when things have loaded -- 
does JS2 do this?
http://dean.edwards.name/weblog/2005/09/busted/
   <!--[if IE]><script defer src="ie_onload.js"></script><![endif]-->
And similar. But, you need to be able to stop it from loading twice, so the onload handler should check some sort of flag.
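i.e. something like this (names made up):

    var mwPageInitDone = false;   // the "some sort of flag"

    function mwPageInit() {
        if ( mwPageInitDone ) {
            return;               // already ran via the other path
        }
        mwPageInitDone = true;
        // ... set up the interface here ...
    }

    window.onload = mwPageInit;   // standard path
    // ie_onload.js (pulled in by the conditional comment above) would just call mwPageInit() too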

- minimum requirement: some way to debug IE6

Trying to schedule 


July 27
=======
god dammit filming is happening here.

not making a lot of progress understanding scriptloader, moving to other stuff I can do today

installing various upgrades to vim...
php style checker


Jul 28


Jul 29


Jul 30

Aug 1

Aug 2 

Aug 3

Aug 4

Aug 5


Aug 6
======
Skype meeting w/Robert M Harris, Ryan Kaldari
* answered tech questions, kicked around various ideas
* surprisingly up to speed, has read discussions going back many years
* assured no opt-out censorship (e.g. SafeSearch)
* uses "we" to refer to Wikipedia, identifies with project
* believes there are unstated principles in wikipedia, perhaps like "don't put wikipedia into disrepute"
* understands distinctions between Wikipedia & WM Commons
* believes that controversy defines controversy, not content per se. Believes that defining images in question is thus straightforward -- it's sex & violence.
   Islam question not part of this (?)
Future discussion -- how much wikipedia's core philosophy is compatible with any sort of censorship

Yet another meeting Trev Guillaume Alolita Mdale on JS2
- message strings -- mdale defended full parser saying it was necessary for strings that we deal with. (Verify?)
- trevor still wants a simpler option since wikieditor doesn't need plurals/genders



Aug 9
======
meeting, monday whatever
meetings, updates, planning UploadWizard
meeting w/Danese, catchup
working on language -- how to replace parser hook to use our own PLURAL and GENDER

./extensions/I18nTags/I18nTags.php:     $parser->setHook( 'plural',    array($class, 'plural') );

Aug 10
======
Let's try just brute force -- changing the parser hook whilst we are using it
hm, it seems that the hook is modified at some other point, just passing a language through doesn't do anything (?)
./extensions/I18nTags/I18nTags.php:	$parser->setHook( 'plural',    array($class, 'plural') );
./includes/parser/CoreParserFunctions.php:		$parser->setFunctionHook( 'plural',           array( __CLASS__, 'plural'           ), SFH_NO_HASH );

step 1 - can we even get normal message processing to work
step 2 - in other languages
step 3 - modify this


trying to figure out how in the hell the parser preferences are even set
it's not obvious that this is even working the way it should

Also, it's possible that these language properties are set globally in some way related to user lang, so I can't flip over so quickly

And what is the role of I18NTags.php here? It isn't enabled locally.

parsing {{ }}...
-> registered with LanguageGetMagic hook
-> is it a variable? (?? what is this?)
-> parser function? registered with $wgParser->setFunctionHook()
-> otherwise, assumed to be a template.

August 11
=========
Deeply annoying, de-JS2-ifying this stupid application
Trying not to rely on anything except jQuery until it's all loaded.
 ?? (browser doesn't load scripts in parallel right? I didn't think so.)
 (Could do a spinning check like mwEmbed does but seriously, fuck that.)
Got most of it working except we have issues enabling PLURAL in the parser
Also should examine the mw.setupMwEmbed function as it seems to do some magic to initialize mwEmbed. Already calling magicSetup, why isn't that working?
Also, why didn't my $j.ready thing work? Perhaps I need to use jQuery instead.

Aug 11-14
=========
More de-JS2-ifying 
rewriting network API

August 16
==========
catching up on various communication
Wikia meeting on RTE

August 17
=========
cutting the cords with JS2
realizing things along the way
UploadWizard
  - supposed to do: filesystem
  - instead: JS rewriting apis
RTE
  - just giving input, but also reviewing state of Closure editor etc.
ResourceLoader 
  - adding PLURALization fail
  - eval of other loaders
  - anti-global manifesto
Gave rate this article Adam Miller popup library
  - do you want evals?
+ ReviewBoard

August 18
=========
green card stuffs (summarizing, researching)
talked with Trevor about URI parser
rewrote URI parser again. Then discovered that most of mdale's URI proxy stuff is in fact not even in 
WMF repo. The whole point of parsing URIs (to determine local versus nonlocal) is entirely academic.
This is not really a productive use of my time. Fuck this for now.


August 19
=========
got uploadwizard mostly working again, with new api object
also, lack of config object (hurray)
a bit more refactoring needed there
spent time figuring out how tokens are expired -- turns out they aren't, they are session vars which expire whenever php.ini says
so. My instinct that it was "about 20 mins" turns out to be correct, 1440 seconds = 24 min.
On the cluster this will be done with memcached (to share sessions among servers) but otherwise similar.
Soooo since we can't really predict when the token is likely to expire let's just trap the error when it does and fetch a new
token transparently.

August 20
==========
Okay let's write something that does this token crap transparently.
Also, it would be nice if it worked, too.

- approaches to tokens

0 - let application code handle tokens

1 - get a token before every POST -- this may be what is supposed to happen anyway
     -- this is more bandwidth but considering that uploads are a big deal anyway, I see no problem with that
     -- however, this Api object is supposed to be useful for other things too

2 - assume that tokens expire in about 20 min, and fetch a new one only if we need it, allow errors otherwise
     -- 

3 - trap the error 'notoken', fetch a new token, and retry
     -- also possible
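
Sketch of what option 3 might look like, assuming a hypothetical api.post( params, ok, err ) and api.getEditToken( ok, err ) -- not the real mw.Api code:

    function postWithToken( api, params, ok, err ) {
        function attempt( token, retriesLeft ) {
            params.token = token;
            api.post( params, ok, function( code, info ) {
                if ( ( code === 'notoken' || code === 'badtoken' ) && retriesLeft > 0 ) {
                    // token went stale -- transparently fetch a fresh one and retry once
                    api.getEditToken( function( newToken ) {
                        attempt( newToken, retriesLeft - 1 );
                    }, err );
                } else {
                    err( code, info );
                }
            } );
        }
        api.getEditToken( function( token ) { attempt( token, 1 ); }, err );
    }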

August 23
=========
head hurts today. Can't brain today, I have the dumb.
Just hanging around until review, going home then
Had progress meeting with Alolita & Guillaume, pushing milestone out a week
Roan and many others are here, should borrow them
TODO today
Review of other loaders
Reading closure compiler lib and the yahoo loader

Aug 24
======
meetings
working on mw.Api and mw.Uri
OMG JASMINE IS AWESOME 
writing a lot of tests

Aug 25
======
Jasmine is really awesome
this api + uri lib is kind of going to rock -- should get it into resource loader
done with testing uri lib
on to api
simplified api infrastructure somewhat, factored in to mw.Api.edit.js
how to test?

I guess we will have to have a real mediawiki instance :(
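
For flavor, the uri specs (which at least don't need a live wiki) look roughly like this -- property names illustrative, not necessarily the final mw.Uri API:

    describe( 'mw.Uri', function() {
        it( 'parses the query string into an object', function() {
            var uri = new mw.Uri( 'http://commons.wikimedia.org/w/api.php?action=query&format=json' );
            expect( uri.host ).toEqual( 'commons.wikimedia.org' );
            expect( uri.query.action ).toEqual( 'query' );
            expect( uri.query.format ).toEqual( 'json' );
        } );

        it( 'round-trips back to a string', function() {
            var s = 'http://example.com/path?a=1&b=2';
            expect( new mw.Uri( s ).toString() ).toEqual( s );
        } );
    } );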

Aug 26
=======



Aug 30
=======
fuck fuck fuck fuckity fuck
now what?

Aug 31
======

mockjax, for timeouts and such works well
should add in stuff for handling the "actual" case too.
however, having difficulty clearing settings. Maybe just reset before every it(). Before params?

error handling shaping up well
however, would be better to handle it in some 'flattened' way...
...sometimes we do want a special handler for timeouts, or particular errors, so we can clean up or retry
...but more often we just want a generalized handler for all errors
...we could do this by supplying a simple error handler for every kind of error, and normalizing the api errors to look like that?
   textStatus -- general class of error: error, timeout, notmodified, apierror
   type -- http_error, http_timeout, http_notmodified, [ api error codes here ]  <-- derived from textStatus
   xhr -- the xhr object
   error -- exception type or message
   results -- json parsed results
       if an api error, will have an error code, and various stuffs in the body.  could go into "type"
    
are we really sure that we need customized handlers for errors in every call? I say yes, because
  - I want handlers tied to the individual upload
  - I want testability with fine-grained error handling. Should be able to pass closures that do nothing at all other than test for error
  - I want handling api errors to be as easy as handling http errors (or unified)
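
So, roughly, normalize everything into one error object before calling the handler -- a sketch using the field names from above:

    function normalizeError( textStatus, xhr, exception, results ) {
        return {
            textStatus: textStatus,       // 'error', 'timeout', 'notmodified', or 'apierror'
            type: textStatus === 'apierror'
                ? ( results && results.error && results.error.code )  // api error code from the body
                : 'http_' + textStatus,                               // http_error, http_timeout, ...
            xhr: xhr,                     // the xhr object
            error: exception,             // exception type or message, if any
            results: results              // json parsed results, if any
        };
    }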

Okay it's pretty close to the time we should be wrapping this up

Sep 1
=====
Upgrading the UploadWizard to use the new API
- rewrote getEditToken so the error condition ( no token found ) is returned via the err() callback 
- writing a failure function 

just for fun tested whether exceptions do propagate back to caller in case of XHR -- they do not

modified api (at least token getting) in UploadWizard
now the question is how to display this to users

1) "Upload files and continue" should disappear after clicking. Or, perhaps we should do what Ryan suggests and begin uploading instantly. And then
have a throbber button?

How does the iTunes uploader do this

        bold title
Thumb   -- progressbar --    button
        status message      

When uploading, button is [ pause ]
When paused, button is [ down arrow ] ( to resume )

When paused, the progress bar is not shown, instead it shows 
    larger title and "tap to resume" is the status

To remove a download, in iTunes you swipe (convention)
we need some sort of button
can't use an x for "Failed" if I also use it for removing

discussed design with Guillaume -- will place status messages below or otherwise near field, similar to convention we use elsewhere.


Sept 2
=======
ok we have uploads and thumbnails working with new api
still have to fix a lot of other stuff, including destinationchecker...

annnnnd then we'll need to actually have PHP to handle uploads differently
We will want to make this orthogonal to the upload method -- in other words if you upload from regular post, or from chunked protocol, or 
from url, or whatever.

it seems that all upload methods obtain the file, and then do a "parent::performupload" (presumably from UploadBase) in finalize()

so, perhaps we just need to insert a hook into performupload and voila.
This thing will do everything performupload does but just not add it to the database, if that's possible.


ApiFirefoggChunkedUpload.php does this:
- overrides performUpload (because it wants to return a particular response for each API call... that seems lame)
   in performUpload, if it detects the upload is now done, calls  performUploadDone (its own thing)
	which does this:
                $this->mUpload->finalizeFile();
                $status = parent::performUpload( $this->comment, $this->pageText, $this->watchlist, $user );
	Question: why is there a mix of member methods and static methods here
		furthermore, how is parent::performUpload even understanding which file this refers to


Basic design
 - override normal performUpload so it *doesn't* add the file to the db.
 - use UploadComplete hooks to extract the usual info (mimetype etc) and return this after upload
 - create some new way of getting a thumbnail of a 'stashed' file
 
 - then, later on, perform an upload from stashed file (methods exist, just not obvious if this works from the API)
   - how to do this
     - when receiving an 'action=upload' request, eventually calls ApiUpload::selectUploadModule
     - if we include a 'sessionkey' in params, but no file, or url,
     - then will look up session data
       - may die with 'invalid-session-key' if too old or whatever
     - and then it will reinitialize that upload, and presumably complete it (?)

  - so the first question is how to stash an upload in the first place
  - the next question is how we can extract info, thumbnails

bryan tong minh created 'staged-upload' branch... did that get merged?
UploadBase::stashSession() saves a temp file and adds info to our session so we can find that file again.
ok so now, who calls stashSession

stashing happens automatically if there are warnings
( ApiUpload.php checkForWarnings() )

but wait -- how do we do multiple files in the same session?
It seems we can pass a key but I don't see any code that ever used that feature.
Here are all the things that use stashSession (may be unrelated...)
UploadBase normally just creates a random key for each stash

./extensions/MultiUpload/MultiUpload.body.php:		$this->mSessionKey = $this->stashSession();
./extensions/MultiUpload/MultiUpload.body.php:	function stashSession() {

here they save it under its own little namespace, which seems like a good idea
$_SESSION['wsUploadData_' . $this->mFileIndex][$this->mSessionKey] = array(
                        'mTempPath'       => $stash,
                        'mFileSize'       => $this->mFileSize,
                        'mSrcName'        => $this->mSrcName,
                        'mFileProps'      => $this->mFileProps,
                        'version'         => self::SESSION_VERSION,
                );


./extensions/MogileClient/SpecialUploadMogile.php:	function stashSession() {
./extensions/SemanticForms/specials/SF_UploadWindow.php:	function stashSession() {
./extensions/SemanticForms/specials/SF_UploadWindow.php:		$this->mSessionKey = $this->stashSession();
./extensions/SemanticForms/specials/SF_UploadWindow2.php:		$sessionKey = $this->mUpload->stashSession();
./extensions/SemanticForms/specials/SF_UploadWindow2.php:		$sessionKey = $this->mUpload->stashSession();
./extensions/SocialProfile/SystemGifts/SpecialSystemGiftManagerLogo.php:	function stashSession() {
./extensions/SocialProfile/SystemGifts/SpecialSystemGiftManagerLogo.php:		$this->mSessionKey = $this->stashSession();
./extensions/SocialProfile/UserGifts/SpecialGiftManagerLogo.php:	function stashSession() {
./extensions/SocialProfile/UserGifts/SpecialGiftManagerLogo.php:		$this->mSessionKey = $this->stashSession();
./includes/api/ApiUpload.php:				$sessionKey = $this->mUpload->stashSession();
./includes/specials/SpecialUpload.php:		$sessionKey = $this->mUpload->stashSession();
./includes/specials/SpecialUpload.php:		$sessionKey = $this->mUpload->stashSession();
./includes/upload/UploadBase.php:	public function stashSession( $key = null ) {
./includes/upload/UploadFromStash.php:	public function stashSession( $key = null ) {
./includes/upload/UploadFromStash.php:		return parent::stashSession();

emailing with Bryan Tong Minh

> You need to add a "force stash" mode in ApiUpload. Should not be to
> difficult; you only need to call stashSession() instead of
> performUpload() somewhere at the end of ApiUpload::execute().

> Multiple files per session is no problem.

Yup, that was my understanding, thanks for confirming.


>> - somehow obtain a thumbnail URL, perhaps by modifying how thumbnailing
>> works and giving it the ability to read these sessions.
>>
> That's more difficult. I'm not sure if Wikimedia is setup for this
> purpose, but if it is you should create a new UnregisteredLocalFile
> from the stashed file path and use that file object to generate a
> thumbnail. I don't know if this is possible, but it should be the way
> to go.

I see. So the point is to wrap it in a File-style interface first?


includes/filerepo/File.php 
 --> createThumb( width, height ) does the right thing it seems
returns URL

Okay that takes care of how we get it in, then how do we really enter it into the db: we use UploadFromStash.
already used in Upload -- just send an upload request with a sessionkey

e.g.   api.php?action=upload&filename=Wiki.png&sessionkey=sessionkey&ignorewarnings=1
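
From the JS side that request would look something like this -- a sketch; assumes we already have an edit token, $j as the jQuery alias, and hypothetical ok/err callbacks:

    function uploadFromStash( filename, sessionKey, token, text, ok, err ) {
        $j.ajax( {
            url: 'api.php',
            type: 'POST',
            dataType: 'json',
            data: {
                action: 'upload',
                format: 'json',
                filename: filename,
                sessionkey: sessionKey,
                ignorewarnings: 1,
                text: text,          // wikitext for the file page (hypothetical here)
                token: token         // edit token -- POSTs need it
            },
            success: function( data ) {
                if ( data && data.error ) {
                    err( data.error.code, data.error.info );
                } else {
                    ok( data.upload );
                }
            },
            error: err
        } );
    }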


ApiUpload calls performUpload on the member, which does whatever it does

then the member (for instance UploadFromFile) can do whatever it needs to do 
then that member calls parent::performUpload, or, it simply inherits (which in practice is in UploadBase)
FINALLY, in UploadBase::performUpload the thing is actually added to the database.

the irritating thing here is that we are now orthogonal in two ways
UploadFromX, but this is now UploadToStash (not db)

we need to intercept parent::performUpload
perhaps store a member value which represents the thing to upload into
the upload interface is actually in the File object! So perhaps we just have to create a slightly different File object. Or...


ok, the way this will work: add 'target=stash' to the URL.
ApiUpload will then look for that and 
 - stash session
 - returns: ... well normally this is a FileRepoStatus object
 -          create thumbnail and return URL (is that normal?)
 - not clean up the temp file. (this is already happening when warnings triggered?)




Sept 7
======
TODO
god dammit get the uploads working again
- do a thumbnail;
  - wrap it in an UnregisteredLocalFile
  - ask for a thumbnail url
- return image info

then commit this mofo

then do those text changes Guillaume asked for

then refactor refactor refactor to use new API
(with warnings, including dupes)

Sept 8
=======
goddamn it 
I appear to be rewriting whatever the code is that does publishing, but where is that code?
we want to check for hash collisions, generate thumbs, etc etc
that may be done in different ways in different repos?
how the hell does that even work?
FUCKING MAGNETS, HOW DO THEY WORK
okay it seems to work via the 'warnings' module


UW sync-up
------------
- schedule meeting w/Erik to get requirements for final state -- incl study
- documentation from me 
  - architecture diagram (including error handling)
  - Roan documentation  
  - code review
- ask for more time
- what to highlight in terms of what's new
-

Erik mentioned that the guy from MS india was actually directed to me by him -- wants help integrating wikibhasha.
A thing which does article assistance 
Replied, gave times for a skype talk
 
Sept 9
======
- missed appointment with MS india, scheduled it at 7am for me at 1am?!?!?
- wrote a bunch of responses to WikiBhasha people via email
- todo - determine licensing requirements, also, how do we feel about proprietary APIs
ok fired off a bunch of responses, queried Erik & Danese about GPL and MsPL.
turns out BSD or Apache is the solution

7:46
spent most of the day getting this bloody xdebug working
also spent (too much) time on this india thing
waiting for call at 8:30, if they even get this


WikiBhasha talk
Naren = tech lead for WikiBhasha
Kumaran = ??

wikibhasha tried per-sentence edits didn't get traction (w/Japanese)
ask them why
erik saw wikibabel 2.0 a year ago, thought UI/UX was complex
wikibhasha, latest version
bookmarklet

- translation
  - gets into Microsoft Translate dictionary, is it project specific or global
  - what tech used to build this -- open source but to what JS standard, does it work on firefox, chrome etc
  - where did you get this model of writing an article
  - usability
  - tried with wikimedians and can we contact them
in any case what help do you need from us?
  - to get released as a bookmarklet requires no intervention 
  - to release as gadget only needs consent from the wiki admins
  - to release as ext with PHP is a bigger deal but you guys are not doing that 
  - license?

logging ?



Sept 10
========
got xdebug working! (although it seems to be mostly the fault of the debug client, it's very unwieldy, needs to be 'restarted' a lot)
also, still have the issue of it being triggered at every session -- I want it to be silent until debugger is triggered

update w/Guillaume and all
gave progress report
talked about integrating tutorial -- apparently no concept or treatment yet?
Guillaume wants to include it as step 0, Alolita & I have issues with that 
scheduled meeting with Erik & Danese to:
  - determine what we should have as end state -- the contract doesn't specify it?
  - determine whether or how to extend
    - reasons to extend
    - 1) we can deliver more or less on target. We should hit many of our goals 
    - 2) but, we would like additional time because:
		- lost a lot of time switching backend and resource loading libraries, several times, during project.
		- although we have front-loaded riskiest stuff first, 
			we are already 2 weeks behind last schedule, and things could still slip.
		- the user research suggested some of our nice to haves are really essential, 
		  such as in-upload progress, pause buttons, which we can only do with HTML5
		- generally it could use interface polish
	3) integrate tutorial? may come in may not
	zz) on a personal level, don't feel that it's really at the level of "awesome" yet
	IMO a month would be plenty

- technically we are extended already due to late start
- would prefer to drop on Oct 




back to debugging
why no thumbnail:
MediaHandler::getHandler: no handler found for .
unsurprisingly, no "handler" found -- but it's looking one up for '', which has no mime type?? ??
also I should set $wgDebugLogFile so I can get debug messages

meeting w/Alolita, what is our game plan for meeting  ( see above )

did meeting 

Okay so fixed how we are doing thumbnail call and everything
annnnd then we have an issue where the "transform()" call expects the UnregisteredLocalFile to be able to do all sorts of stuff
with its title, and also to construct URLs
we *could* call doTransform directly, and create our own thumbnail urls and whatnot
but then we wouldn't get the benefit of the default icon, or it would mean duplicating all that code
so, the correct thing is for the UnregisteredLocalFile to have a "null" repo, that returns dummy responses to all of that
including figuring out where the thumbnail should go -- it has to be accessible somehow to the web...
so in reality are we actually making a FileRepo? no, but in another sense we are
will this even work on commons then??
the other alternative is to send a URL back which can generate the thumbnails on the fly. That seems like a bad idea if the file is a video or something
Also, how are thumbnails generated today, anyway, for things like videos??? Those may take longer than 30 seconds to generate thumbnails, surely
Unless we rely on it always being < 30sec

- new plan
-- return a thumb URL, mimetype, and dimensions immediately, probably underneath Special::UploadWizard
-- effectively this is a SORT of repo, but we don't have the requirements of making any URL findable. Do we?
  -- SessionRepo
-- but don't actually do the thumbnail
-- instead figure out some other way to respond ( subpage from Special::UploadWizard probably )
-- then we'll have up to 30 sec to generate thumbnail, or can do something special client-side
-- anyway, then we cat the file right out in that image URL request, and MAYBE save it to disk, much like extensions/WebStore

Sept 13
========
discussion with Bryan Tong Minh
(recorded in ~/Documents/wmf/chats)
gist:
- the LocalRepo = the stash directory = shared over NFS, at least for now.
[10:14am] Bryan:
yes, in Wikimedia, LocalRepo is an NFS share
http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php
search for LocalRepo
and in InitialiseSettings.php
'wgUploadDirectory' => array(
    # Using upload5 since Feb 2009
    # Using upload6 since Jan 2010
    'default'      => '/mnt/upload6/$site/$lang'

Therefore we do not need to do exotic things with the regular repo.
Look in staged-uploads -- which creates an entire new db table, but otherwise extends the File object not the repo
Bryan refers me to the README in the filerepo dir, which tries to establish that the filerepo only does the minimum, and the 
File object should be supreme
Bryan agrees we don't really need a new db table
(I am also reluctant to make a new table since this is extension functionality)


sync up meeting
what's missing from me?
- arch diagram
- docs
- code review
- anything else

Explained redesign with repository and files to Alolita
also the NFS /tmp thing
on Alolita's suggestion, emailed bryan & offered trip to DC
chatted with mdale about this, he says it's okay to ping him to review some stuff 
later in the week

By accident, happened upon some #wikimedia-commons discussion about deleting FBI seal again
got mgodwin on the case, added some stuff to deletion-discussion page.

okay, now, what was Bryan Tong Minh really up to with his patch? It seems to me he was getting kind of close to a 
Repo object, after all.

He modified UploadFromStash.php so that it depended on getting the Stash from LocalRepo, and then checking what file was stored there
This seems very repo-ish to me.

Trevor offers Roan & his help next week


what is required, for fucks sake
ok we are storing stuff in $wgUploadPath/temp ?

- we need to be able to upload
  - and do the uniqueness check in the normal path space
  - also the check for unique name

- we need to be able to get a URL for a thumbnail -- generating currently in the public area, need to move to temp, and create
some sort of handler for it

Namespaces are defined in the global namespaces
or, we could say, Image:Stash:something

...problem, we don't have a repo in UnregisteredLocalFile
this causes a problem in getDescriptionUrl line 1076 of File.php, which is called in the middle of File->transform to potentially add a comment
to the thumbnail...
so we'll have to add a repo; OR; subclass UnregisteredLocalFile so it returns some kind of dummy URL, or or or?!

ok, manually adding a repo (LocalRepo) seems to have worked
it automatically created the thumbnail in images/ hash dir / 120px - ( name of random file )
however, we'll need an appropriate extension for that to work, so we should meddle with the extensions
somehow

then we'll add some iinfo

and then then then we need to use the 'upload from stash' properly (probably need to obtain the stash file in returned info)

we still have the issue that we put the icon file into public space, by using the public repo
we should place the thumbnails in temp space as well
and then, have some way of getting them that's unique to the session
wiki/SessionFile:phpinsOM5/original.jpg
wiki/SessionFile:phpinsOM5/thumb.jpg



Sept 15
========
reading about hooks, since I want to put as much code in extension as possible
oops, should I respect UploadForm:BeforeProcessing?

hooks, may want to consider: 
UploadForm:BeforeProcessing
UploadFormInitDescriptor
SpecialUploadComplete

ok now what?
let's work on getting the extension appended to the thumbnail
then let's put it somewhere NOT in the normal space
and write some way to access it using a different thumbnail url

enabled wfDebugLog just for the halibut

why can't we get the right extension: this is why
- we calculate thumbname before we do the actual transform
- we currently REQUIRE that uploaded files have a certain extension.
- so, using the temporary file (which is extensionless) is surprising.
- the system does not detect the case of an item which doesn't have an extension -- the generic scaler just assumes that extension is the same as before, and 
  tests if they are DIFFERENT, not MISSING
- this is of course obviously wrong, since empty string = empty string
- hrmrhrmrhrmrm
- what we really want is to get the mimetype of the item that we WILL have once we call transform
- or!!!
- we could check for the case of 
   - this->handler->getThumbType returns empty string
   - and we have no extension
 or, we could somehow fake the extension?

based on suggestion from Danese emailed Bryan Tong Minh suggesting a quick Friday - Sunday trip

Sept 15-16??
========
posted on Wikitech-l -- some stuff brewing about developing an extension for wikis that need to track the licenses associated with stuffs
n.b. need to understand page_props
- it is owned by the parser
- does this generate metadata that we can search on? what are page_props?
--- okay that was a waste of an hour :(

Alrighty then
the user's session -- the filename is already stashed there, nothing to do

now we need to place the file under the temp directory -- should be SIMPLE... probably the same directory would suffice
and then we need to return a URL that we can read

when we get a request for that URL, wiki/Special:UploadWizard/thumb/120px-somefile.jpg
parse the url to obtain 'somefile'
and then get the path to the file

finally we need a method that brings the stashed file into the database, with our supplied wikitext


ok it is creating in the right place and all, but, we are still removing the temp file for whatever reason

we cannot just leave the temp file there because PHP will automatically remove a file created with tmpnam
so we are going to have to create a "session repo" after all 
or, the equivalent, which was to attach a "stash" to LocalRepo as BTM did

wrote a bunch of 
okay we got the right extension, although, in retrospect, it's really only important to have that in the URL, not the filename itself. But whatever

- next we have to place this file somewhere in 'temp' and not in any other zone

meeting with art director that went absolutely nowhere, no agenda, etc. etc.

Sept 16
=======
meeting w/Alolita & Guillaume
***REDACTED***

ok we have to place the file in temp

mdale mentioned that he had some way of moving files out of temp so presumably they don't get cleaned up
...
Firefogg extension is the new resting place for this stuff, by hexmode
ok so each individual "upload" is really just a chunk, so they override performUpload()
repoPath is saved in the session itself
and generated lazily on first upload chunk
have to watch for session version # (could theoretically change in between sessions)
self::session_keyname is defined in UploadBase ('wsUploadData') -- getSessionKeyname() also there
just a prefix for our session data as compared to anyone elses
repoPath is created on first appendChunk()
by using saveTempUploadedFile from UploadBase

...



meanwhile...
how does uploadFromStash really work
this is invoked whenever an upload "temporarily" fails, so perhaps we should emulate that.
if an upload fails for some recoverable reason, then
stashSession is called
which ALREADY calls saveTempUploadedFile... we should figure out where that's really going so we can watch it
and then the warning is returned
and then one can upload it again, by using a sessionkey parameter to an action=upload API call
...

okay so we have a problem now
we need a way to optionally stash it to an ARRAY of files, not just one per session
so, let's modify stashSession for a preferred stashkey?
let's call it stashkey =  (something client side)
then stashSession() will put it into a key-val thingy

okay we have a very pretty looking SessionStash object (which is almost general enough to use in FirefoggChunkedUpload too)
no other extensions in trunk are actually inheriting from UploadBase, they are reimplementing
so it's reasonable to start patching behaviour in UploadBase too


Sept 17
=======
ok does this crap even work
not recognizing 'stash' param...


Sept 20
=======
why the fuck is this not fucking working



October 4th monday in two weeks
put bastet on prototype
October 6th announcement

- Milestone C
- blocker assistance?
- diagram of architecture
October 24th 

Sept 21
=======
most of morning taken up at doctor's, in at 2pm
difficulty committing
roan / alolita suggests commit to a branch
ok I appear to have this crap semi-working now, returns the right url
now we need to write the Special page which accesses it and cats out appropriate info


- risk: getting a real test with similar filesystem -- ask roan & mdale to look at code
  -- get a good description of Commons from Mark Bergsma or Bryan TM; so can mdale

- Roan is going to be available for code review on Monday onwards
- Crossing off bugzillas -- update tasklist (google docs)
up till 3am

Sept 22
===========
branching committing merging 

Sept 23
=======
bugfixes
okay for some reason {{PLURAL}} does not work after all those merges
oddly mdale and I track down every other source of bugs and everything seems to be identical including jquery
we can even get messages to parse in one window (running UploadWizard) and break in another (running MWEmbed branch)
(much debugging later)
the reason appears to be jquery which adds a "compare" to the array object (?)
mw.Parser line 213 uses a "for var i in node['child']" which gets the 0 then "compare"
but I thought the jQuery impls were the same??
WAIT WTF WHY IS THE PARSER IMPL DIFFERENT TOO?
THIS IS RETARDED
I THOUGHT I CHECKED ALL THESE DIFFS?
ok the parser is implemented differently but just for convenience methods
nevertheless the line that breaks is the same in both cases

in any case,
***REDACTED***

it replaces all $n vars globally, whatever
but then!!
to parse the templates
it parses it char by char
when {{ encountered, drops into child -- good
then creates a template node, which is parsed in really complex ways
and THEN, adds the template code right back into the parent node
and when it evals the child, does a search and replace on the parent node with the child -- relying on the idea that PLURAL:2|foo|bar is unique in the parent string
WTF. Does not even record offsets, let alone tokenize
ALSO, the template parsing could result in amazing weirdness if the template returns stuff meaningful to parser like }}

ok, in the one that works, there is no "compare" in the Array
strange...

MWEmbed is using jquery v1.4.2
same as UploadWizard

in the resource loader mediawiki, it adds Array.prototype.compare if there is none
decision: for var in X is bad
eliminate it from parser


Sept 24
=======
fixing var in x issues
Bryan Tong Minh's statuskey thing is broken (for me) -- he optionally creates statuskey only if a global is present
-- the parameter should not be conditionally defined, only conditionally accepted

committed misc fixes from the comments 

changed around the return from Session stash file transforms; we return a file object in the ThumbnailImage now, so
the caller can get file info and metadata.

woot, the regular file info stuff is now working (simulating ii API in return)
now, trying to extract metadata as well -- returning null?

installed exiv2 so we can manipulate and read exif on command line

debugging
JPEGs are handled by the generic BitmapHandler
PNGs are handled by PNGHandler...

hokay let's debug BitmapHandler
seems to be controlled by wgShowEXIF
change it in LocalSettings??

okay, uploaded a JPEG which definitely had EXIF data and
it explodorated
Fatal error: Call to undefined function exif_read_data() in /Users/neilk/Sites/wiki/includes/Exif.php on line 299
Perhaps this ports php install doesn't have Exif support :(
crappity crap crap, no exif support in the ports php
I guess I am going to compile my own :(

grrr. talk to bawolff see if there are modern ways of doing metadata

- maybe ignore this issue for the time being
- also, might be good to include a parameter for selecting the image props to be returned.

strategy for getting this in 
-- inherit from UploadFromFile -- make an entirely new module UploadFromFileToStash
-- with new method names
-- and register them later
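roughly the shape of it (class names per the above, everything else is sketch):

// keep the file in the stash instead of publishing it
class UploadFromFileToStash extends UploadFromFile {
    public function performUpload( $comment, $pageText, $watch, $user ) {
        $key = $this->stashSession();
        return Status::newGood( $key );
    }
}
// registered from the extension rather than patched into core, e.g. via an API module:
$wgAPIModules['stashupload'] = 'ApiUploadToStash';  // module name invented
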
Bryan's new upload stuff is making it harder to refactor this
will suggest moving the upload statuskey elsewhere -- it should not be in the upload method itself, as it is just getting the status of an upload


Sept 27
========

Allrighty then -- hookified most of this
okay we did not actually trigger the right upload?
where's the thumbnail?
TODO -- get an apache that works with metadata!!!
-- trevor mentioned details on how to do that are in new employee orientation
recompiled php
now we have to 
- ensure that the opt/local apache gets this php
- or, otherwise switch back to the apple-standard apache
It looks like the apple-standard apache got this php, by default
sooooo
we should um um um 
-- copy the php config from /opt (/private/etc ??)
-- re-enable the standard apache at startup (/usr/lib)
------------------------------------------------------------------------------------------------------------------------
/usr/share/httpd/build/instdso.sh SH_LIBTOOL='/usr/share/apr-1/build-1/libtool' libs/libphp5.so /usr/libexec/apache2
/usr/share/apr-1/build-1/libtool --mode=install cp libs/libphp5.so /usr/libexec/apache2/
cp libs/libphp5.so /usr/libexec/apache2/libphp5.so
Warning!  dlname not found in /usr/libexec/apache2/libphp5.so.
Assuming installing a .so rather than a libtool archive.
chmod 755 /usr/libexec/apache2/libphp5.so
[activating module `php5' in /private/etc/apache2/httpd.conf]
Installing PHP CLI binary:        /usr/bin/
Installing PHP CLI man page:      /usr/share/man/man1/
Installing shared extensions:     /usr/lib/php/extensions/no-debug-non-zts-20090626/
Installing build environment:     /usr/lib/php/build/
Installing header files:          /usr/include/php/
Installing helper programs:       /usr/bin/
  program: phpize
  program: php-config
Installing man pages:             /usr/share/man/man1/
  page: phpize.1
  page: php-config.1
Installing PEAR environment:      /usr/lib/php/
[PEAR] Archive_Tar    - upgraded:  1.3.7
[PEAR] Console_Getopt - already installed: 1.2.3
[PEAR] Structures_Graph- upgraded:  1.0.3
[PEAR] XML_Util       - already installed: 1.2.1
[PEAR] PEAR           - upgraded:  1.9.1
Warning! a PEAR user config file already exists from a previous PEAR installation at '/Users/neilk/.pearrc'. You may probably want to remove it.
Wrote PEAR system config file at: /private/etc/pear.conf
You may want to add: /usr/lib/php to your php.ini include_path
/Users/neilk/Documents/wmf/php-5.3.3/build/shtool install -c ext/phar/phar.phar /usr/bin
ln -s -f /usr/bin/phar.phar /usr/bin/phar
Installing PDO headers:          /usr/include/php/ext/pdo/
------------------------------------------------------------------------------------------------------------------------
removed the LaunchItem for opt/local, and re-enabled the usual HTTP sharing.
EVERYTHING WENT BETTER THAN EXPECTED.

Okay now it all sort of works, with core modifications
moved some libs into core, too.
now I need to
-- add unit tests
-- add the Special:SessionStash page
-- write some doco and request review from Roan
-- rewrite how the bloody Javascript parts work


Sep 28
=========
Quick impl of SpecialSessionStash.
wait... why is the special page not working?
ok we had to add an alias, and also, the entry in SpecialPages.php was wrong -- the first param is a classname, not some general "hi this is a specialpage"
all righty then why is this not executing correctly

1) need to do these exceptions better -- make subtypes perhaps
2) we may be inadvertently creating these files on 404. Absolutely must be the case that nothing is created on getFile()
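sketch of what I mean by subtypes (names invented here):

// 1) subtypes so callers can tell "no session" apart from "no such file"
class SessionStashException extends MWException {}
class SessionStashNotAvailableException extends SessionStashException {}
class SessionStashFileNotFoundException extends SessionStashException {}

// 2) getFile() stays strictly read-only -- never create anything on a miss
public function getFile( $key ) {
    if ( !isset( $this->files[$key] ) ) {
        throw new SessionStashFileNotFoundException( "no file with key '$key' in stash" );
    }
    return $this->files[$key];
}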

uh oh chrome does not like downloads for some reason :(
-- no that wasn't the reason
-- 

took some comments from nikerabbit, fixed

Let's write a unit test, to see that the whole thing hangs together?


Oct 1
=====
Investigated how all that unit test stuff works
best example is hexmode's testing in Firefogg extension phpunit test
can simulate requests + uploads by manipulating files
confirmed with Trevor however that these tests referred to obsolete or idiosyncratic directory structure.
wrote puf script for my own usage to run tests.
refactored createRandomImages into library + script so I can generate files in the program.

Oct 2
=====
figured out the reason why chrome doesn't get the image
the SessionStash is tied to the session, not the user.
sessions are DISTINCT from logins, and are unique only to the browser + browsing session.
I guess that's sort of a good thing as some wikis allow anon uploads.
can "hijack" merely by manipulating declared session; or, possibly by using PHPSESSID in URL?

bugs
- not logged in - try to get session file -- cannot retrieve session, backtrace, with logo

Oct 4
======
- Christ on a cracker, we're supposed to drop Bastet on prototype today
asked Trev for review today :(

Roan prefers private vars with accessor for immutable data



Meeting -- Guillaume & myself, Alolita en route from France
- Discussed state of project -- looks like 2 weeks more for Bastet, really
- could deploy to prototype but only for internal usage
- obviously blog post a bad idea, right now.

Arranged code review with Trevor


Oct 5
======
meeting with alolita
todo -- weekly or more images to prototype 
- Guillaume to turn comments into bugs
bugs getting the goddamn last things out of the API


Oct 6
======
meeting with artist, set some params for his work -- looks good
talked about internationalizing, SVGs and such
did major refactor of how API is working -- am now taking over all session-stashing, which means I will need to write tests for the methods
I'm overriding.
okay, why am I not getting the params I want?
/Users/neilk/Sites/wiki/extensions/UploadWizard/ApiQueryStashImageInfo.php:91 - wfDebugCallstack
/Users/neilk/Sites/wiki/includes/api/ApiBase.php:461 - getAllowedParams
/Users/neilk/Sites/wiki/includes/api/ApiBase.php:499 - getFinalParams
/Users/neilk/Sites/wiki/includes/api/ApiQuery.php:257 - extractRequestParams
HA, VICTORY!!!
okay, at least the API call is returning the right url, just need to wire that up
ugh all kinds of weirdness in ApiUpload, transformWarnings, when did that happen?!?!

Oct 7
=====
Push through with the rejiggering of UploadWizard.js so I can release to prototype
 - need to get the upload from stash working DONE!
 - now need to get the whole title / naming issue sorted out.
Okay so what *should* happen?
  There is:
    - no title yet (none is needed when we send it up to stash)
    - the original filename
    - the human readable title edited in 
    - the title we use for destinationcheck
    - the title we request in the final API call to upload
	- just a question, why is our last API call going through the iframe? we have to do this for IE or something??
    - the File:title_thing.jpg  for the URL

Wikimedia terminology: filename is Foo_bar.jpg

MEDIAWIKI'S TERMINOLOGY
Title = "File:Foo_bar.jpg"   <-- also called "prefixed database form"
  - namespace = "File"
  - main = "Foo_bar.jpg" 
    - text = "Foo bar.jpg" (human readable)
     (own terminology now)
     - human = "Foo bar"
     - extension "jpg"

prefixedText = File:Foo bar.jpg <-- typically used for display 
prefix = namespace = "File"
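to keep myself straight, the Title accessors map onto that roughly like so (from memory, worth double-checking against Title.php):

$title = Title::newFromText( 'File:Foo bar.jpg' );
$title->getPrefixedDBkey();  // "File:Foo_bar.jpg"  -- prefixed database form
$title->getPrefixedText();   // "File:Foo bar.jpg"  -- typically used for display
$title->getNsText();         // "File"
$title->getDBkey();          // "Foo_bar.jpg"
$title->getText();           // "Foo bar.jpg"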


Oct 8
=======

Schedule as we know it
========================

October 8th
Prototype: code drop

front end improvements

October 15th
(get eyeballs on prototype)
Village pump
Blog post
Signposts

(dark launch on commons)

Oct 22nd 
code drop of prototype

Oct ? blog posts?

Oct 26th
THE END OF OCTOBER

Oct 27th - Oct 31st


Oct 26 OR Nov 1?
launch on Commons with link from existing commons form
possibly, watchlist notice?

Oct 26 --> Nov 1 tech blog post


========================================


solving naming issue with a 'Title' class
unfortunately that broke uploading ;)
need to fix. :(


then I need to check if this bloody result code works and if warnings work

fuck fuck fuck
okay, for whatever reason, the title fieldname no worky.

ok this sort of works except for the very end -- the descriptionURL is using the old value of the filename

TODO
----> we got a new descriptionurl in the final upload from stash, we should update to that.
----> <b>Parse error</b>:  syntax error, unexpected T_PAAMAYIM_NEKUDOTAYIM
in
<b>/srv/org/wikimedia/prototype/wikis/branches-uploadwizard/includes/upload/SessionStash.php</b>
on line <b>177</b><br />

now -- sessionstashnotavailable??

deal with failures better, too.

- original filename 	bar/baz/foo_bar.jpg

Then, let's circle back and write some unit tests for uploads of all kinds in trunk, merge them into our branch

Oct 12
=======
convert call is not working
hurr, errors are redirected to stdout, can we capture these??
-- problem 1: no comment leads to the shell call not working. fine, put in a dummy comment?
  here's the problem
   - lack of descriptionUrl 
   - leads to empty arg
   - wfEscapeShellArg does the wrong thing with empty args :(
   - sooooo.... we should just have a dummy descriptionUrl, I guess? Or use its own url as descriptionUrl. (quick sketch below)
-- problem 2: can't write to the appropriate dir??
  not a problem -- it's just the user I am interactively logged in as
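quick sketch of the workaround for problem 1 (helper name is hypothetical):

// never let an empty descriptionUrl reach wfEscapeShellArg() / the convert command line
if ( strval( $descriptionUrl ) === '' ) {
    $descriptionUrl = $this->getStashUrl();  // hypothetical: the stashed file's own URL as a stand-in
}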
  

ok, now we have a problem with some newly defined constant in GlobalUsageHooks -- Title::GAID_FOR_UPDATE
this has nothing to do with us, it's just that we're out of sync with trunk, or something...
why doesn't it fail on local?
r73549 is the moment we diverged

Oct 13
=======
allrighty the next thing
docs for wikitech

- make errors work the way we always wanted
 ... ok it really seems that the standard way to die is dieUsage, and even if multiple errors are detected, to use the "first".
  ... this is LAME
 .... but we can't fix it all today.


Documentation
--------------

Oct 18
======


Oct 19
======
... do the language thing for RL??
... or at least fix it up so it works
okay did a lot of merging, and then touchups from that
there is something broken with concatStyles(), waiting for trevor to fix

in the meantime

Write those goddamn unit tests already
...if they don't already exist

ok let's back up here
-- I made an alias called "puf" to run unit tests, but it complains saying the globals probably aren't set right, need to use their script 
 invoked with a Makefile (make destructive, since the tests are destructive).
-- tried to run them the way they suggested, failed since the PHPUnit libs weren't in php path
-- uncertain if there are any upload tests that currently work, and what is broken about them

Oct 20
======
but first, let's get the JS language working again
actually, mdale is working on this... or was...?

ok maybe unit tests are overkill, or maybe not...
let's just prove to ourselves that we haven't broken regular uploads
- and, at the same time, bring back the "standard" upload form...
ok, let's try writing an Upload api test...
alrighty then, step 1 is to make ANY api test pass
this kind of invocation works -- don't ask
neilk@ivy:~/Sites/wiki/maintenance/tests/phpunit$ /usr/bin/php phpunit.php --configuration suite.xml --filter ApiTest

some of these pass, some are skipped -- are any critical??
then we'll add a simple ApiUpload test...

Oct 25
======
ok the test may MAY be working...
we need to add checks for md5 hash (remove it first) and then double check it is actually uploaded by getting the URL, downloading, comparing.
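rough shape of the extra check (method and field names invented for the sketch):

// after upload, fetch the file back by URL and compare md5s
public function testUploadRoundTrip() {
    $url = $this->uploadResult['upload']['imageinfo']['url'];
    $fetched = Http::get( $url );
    $this->assertEquals(
        md5_file( $this->localFilePath ),
        md5( $fetched ),
        'uploaded file comes back with the same md5'
    );
}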


Nov 1
=====
a shortish day due to geoff call
DID YOU KNOW?!?!? our deploy branch is from february 2010.


Nov 2
=====
merge merge merge

Nov 3
=====
SF giants parade / riot-in-the-making
okay let's cross some things off

- get the CSS working
-- were we missing CSS for jquery ui perhaps?
DONE
prototype updated
for parallelization, working to get a mockup of the tutorial in
will need to grab it from commons, so we assume that InstantCommons is available? and how to get the full-size url? Can this be done in PHP-land if we 
move the "steps" HTML there?

Nov 4
=====
ok, got the tutorial step working (is it truly necessary to have a secondary "steps" array? wondering...)
problem: the moveFileInputToCover thing breaks
why
  presumably this happens because we obtain the position before (or after) the image has loaded, pushing it down a lot
it seems to be defining it twice, let's see what's going on
ultimately this is kind of annoying; rather than moving the file input to cover, we should just create it in the right place, which doesn't move
however, we didn't do that...
can we just have this be relative to something else?

THOUGHT
or maybe it would be easier to have them all be separate PHP-generated pages... this is possible since now we have an external store of info, we
could just keep adding metadata to the files in UploadStash.
  ...we could obtain all the info about uploads with some new API query (although ApiQueryStashImageInfo works...) or!! dump it info 
  ...then to add info to each upload we just POST, and then add it to the UploadStash item, and read it out again
   ...that means each "next" page is the target for the previous one's data. Not a problem
  ...the release rights macro page and the describe page do have to be linked somehow (or, the rights are just hidden, and you can click to expand)
  ...the describe page and use page also have to be linked, but that can be as simple as showing you files you just published.
  ...could you randomly click to different states? No, we could check if you had done "rights" and "describe" for all extant uploads
  ...then we could/should clear these out of your session
sigh

ok, moveFileInputToCover is being called twice, once from
createInterface() -> newUpload() 
and then again from the closure in moveToStep( 'tutorial' ... )
well duh, I guess we'll just move that closure!
ok that's fixed
now we need to get the file from commons itself
create a thumbnailimage I guess and assume that this is commons or that instantcommons is on?




Remember to ask Alolita about Calcey




and then we'll bring it all back into trunk
then it's JS error subsystems
then bring back Chrome, IE6 etc

Nov 5
=====
finished de-resource-loader-ifying everything
but we still need minified resources, working on a makefile

Nov 6
=====
working on adding the tutorial, did it in JS but it's hella slow
did a code review of this new guy (Russ) that Danese wanted to contract for a Media storage 
he's v.good unix systems dude but the web is clearly a bit of a foreign platform for him
Ariel & Danese ask me to pair with that guy, start developing an architecture for scalable storage on Commons.
scary, but I have the connections, Flickr & others. 


Nov 7 (sunday)
=====
- committed the new apiqueryimageinfo
- committed the new PHP createInterface stuff, created tutorial on PHP side and it's hella fast
- rather than use a makefile, I'm using a PHP class to define dependencies in a way that will work for RL and for this new hacky loader I just added
- also moved minification to a PHP class rather than using a bunch of random sed scripts and the jslint binary; we use php regexes and the php jslint library (which has Tim's mods anyway)

Nov 8
======
- Ariel asked if I'd made contact with Flickr et al... wants to move fast!
- first reply to Calcey -- 
  need to develop test plan further, their skeleton looks surprisingly good though

- squash all the bugs so far reported, or note they have been squashed
  - trivial bugs
  - error handling bugs
  - session requirement bugs
- code drop 
- generate test plan for Calcey / Mangala


Nov 9
=====
fixed some stuff with the tut, particularly i18n
meetings etc

Nov 10
======
ack got to prep for external testing
- write test plans
- sever instantcommons -- why did we have this in the first place...? 
 - o_O ugh, we do need this for the tutorials
 - then create "purge everything" button?
- get them some media files - jpg, gif, ogg
- for verification we also need to have the templates right (what happened to them?)

- loadingSpinner class
- ERROR CLASS GODDAMN IT

Alolita: new razor = does this block deployment

historyeraserbutton
- purger for tests
- perhaps simplest thing is to make them all admins, and then have the script only be runnable by admins.
- ask them to log in and give us user names

Nov 11
======
coughing up lung, sleeping

Nov 12
======
FILMING IN THE OFFICE OMG FUCK EVERYTHING ABOUT THIS

todo today

flesh out test plan

forget about purging, just tell them to use new images (for now...)

code code code 

get with Calcey on Wed, Thu...?

11am list of bugs for Alolita -- look for showstoppers only
review test plan with Alolita

Nov 15
======
Calcey is asking NOW WHAT?

didn't get much of anything done over the weekend

let's fix some crap

fixed the period bug
let's see if that Special:SpecialPages bug is still open

numerous small fixes today
fixed complicated bug with getting image info

consolidating a large patch for Roan
76581 -- doxygen warning changes

previously merged in Roan's patch (r76014) which merged in r75906
-----------------------------------
includes/upload/UploadBase.php
includes/upload/UploadStash.php
includes/upload/UploadFromFile.php
includes/filerepo/File.php
includes/api/ApiQueryImageInfo.php
includes/api/ApiUpload.php
includes/AutoLoader.php
includes/specials/SpecialUploadStash.php
includes/SpecialPage.php

So, we need to get changes to most of those files...
plus my changes...

r75907 -> 76784
new files

includes/api/ApiQueryStashImageInfo.php
includes/api/ApiQuery.php

made a patch but haven't tested it yet


Nov 16
======
reviewed patch w/Roan this morning
getting ready to test live

will repurpose local interface to submit to api there
- test before 
- test w/api.php documentation
- test after


TODO -- figure out holiday


the success order:
mw.IframeTransport.prototype completion
-> processIframeResult()
gets result, passes it to apiUploadHandler transported call
--> this.upload.setTransported( result );
  -- triggers "transportedEvent" on the ui div (changes it to OK)
  -- if ( ok )
  -- else ( stashed with errors )
  -- else ERROR
       puts up alert (lame)
       this.ui.showFailed()  (as opposed to trigger, lame??)
       alerts()
       TODO tag upload as failed

it's all in "setTransported"
okay so we ought to add some way to show the errors.

we need a concept of completed != transported
ok so
   when it returns, set completion to true
   and the status may be transported, or errorFatal, or errorRecoverable

if any errorFatal, note that and show continue button
  on continue, remove them in an obvious way

if there are any errorRecoverable, ask to try them again or continue

if all success, just say great, continue!

okay step 1 -- make the continue step different from upload
make arrow-y button style

Nov 21
======
make the status updates have real messages
- mwe-upwiz-transport-started

DONE make the OK button look right - with right hue (or match "OK" hue)
DONE when upload transport is finished and all is well, remove the file ctrls
DONE fix the progress spinner -- shouldn't have message at all
DONE remove the "OK" from the checkmark.
DONE fix the upload counts, why not updating? -- because they're looking for a different state
DONE but perhaps should be more general -- add a get thumbnail mode right after finished uploading... 
DONE (finished uploading...) is probably obsolete
DONE fix the difference in widths or something between the OK and progress spinners now.
DONE When files complete:
  - GroupProgressBar bar should fade away
  - does it make sense to say '4 of 4 files *uploaded*' ? what else do you say? 3 of 4?
     - distinguish between "uploaded" and "error" conditions
     - the progress bar should respond to any completed state
     - the 4/4 files thing should count only non-error states
DONE-ish Make errors say error, and why.
   typical error:
   iframe:json::{ "servedby": "ivy.local", "error": { "code": "verification-error", "info": "This file did not pass file verification", "details": [ "filetype-mime-mismatch" ] } } 
	are these info messages already localized? I doubt it...
	nope


Nov 22
======

Got the buttons looking like buttons.
Okay, so now, we want startUploads to yield control back to its closure once all uploads are in a resting state.
oddly, this only occurs if all of them fail. Why?

Then we can add interface which says:

                                                       All uploads were successful! [ Continue ]

                               Some uploads failed. [ Retry failed uploads ] [ Continue anyway ] 
       
			           None of the uploads were successful. [ Retry failed uploads ] 



DONE - separate upload files & continue
DONE - Continue button
DONE - try to recover failed files

DONE - style the next buttons everywhere...

DONE / FIXED Tried to deploy this on uwd, getting problems due to non-existence of window.mw

DONE - fix that last issue with the URL of the image

issues with trev's window.mediaWiki window.mw mw fixes
aliased them all, just gave up



Nov 23
======
... getting a lot of reports about uploads hanging. Can't replicate any of them




- Aborting the only upload should not be equivalent to success. ;)

- file extension problem should pop open a dialog


- progress bar is now empty -- need outline & fill for the bar -- but it works fine on uwd where theming is slightly different
( did we load progress bar css? )



Nov 24
======
deploy day

we have an issue with convert -- it's not installed on our servers
so, the theory now is to do this
-- the temp files are shared in NFS between us and scalers
-- Roan is writing a handler and squid config to ensure that the scalers can read temp files and generate thumbs.
-- then we write a curl handler to call a URL like this:

http://upload.wikimedia.org/wikipedia/test/thumb/temp/e/e6/20101124183211!phpO2cZaD.jpg/120px-20101124183211!phpO2cZaD.jpg

http://upload.wikimedia.org/wikipedia/test/thumb/temp/5/5b/20101125022040!phpU2ftWO.jpg/32px-20101125022040!phpU2ftWO.jpg

and then we have two options
-- cat the file out right to the user, if this is in the UploadStash url handler
-- or, now that we know the file is on the filesystem, proceed normally
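the first option is basically just this (sketch; content type hardcoded for the example):

// "cat it out to the user": fetch the scaled thumb over HTTP and stream it back
$contents = Http::get( $thumbUrl );
if ( $contents === false ) {
    throw new MWException( "could not fetch thumb from scaler: $thumbUrl" );
}
header( 'Content-Type: image/jpeg' );  // hardcoded for the sketch
header( 'Content-Length: ' . strlen( $contents ) );
echo $contents;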


Nov 29
======
all right so can we figure out thumb.php so we don't ruin everything
and can test locally?

ok we can't actually use Roan's hack to thumb.php in trunk, as it uses some evil LocalRepo hacking
so instead
we do this:
- transform() only returns URLs. It always returns Special:UploadStash style URLs.
- in SpecialUploadStash, we have a flag / config which enables different streaming backends.
   1) create locally with actual scaling
   2) use a "remote" service to create them (which relies on NFS paths to get the file there, but we DON'T rely on NFS to obtain the thumb and stream it out -- that's all HTTP)
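in code, the flag amounts to something like this (config and helper names invented for the sketch):

// backend switch in SpecialUploadStash -- names are not final
if ( $wgUploadStashScalerBaseUrl ) {
    // 2) remote: ask the scaler over HTTP and stream its response back out
    $thumbUrl = $wgUploadStashScalerBaseUrl . '/' . rawurlencode( $relativePath );
    $this->outputRemoteScaledThumb( $thumbUrl );        // hypothetical helper
} else {
    // 1) local: scale with the normal media handler machinery
    $this->outputLocallyScaledThumb( $file, $params );  // hypothetical helper
}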

okay coded that up --

seeing if it works with "local" system
now we aren't getting any thumbnail info at all
investigating
we can't retrieve the original file either
the stash is still stashing sans extension -- fix that, might be the problem
also we seem to be storing under the public url, we want to store locally under local name.

Discussed MediaStorage requirements w/ Ariel, Mark Bergsma

This seems to all work now!

Okay, next step is to port those changes to uwd

Did a crapload of screenshots for the blog post and other publicity

Nov 30
======
1:00am deploy (since this is the most convenient time for Roan in CET)

Reviewed changes, some issue on the use of \Q \E, otherwise he was okay with it all

Added the last minute change Guillaume asked for, to add all uploads to the hidden category Uploaded_by_UploadWizard, for tracking purposes.

Some issues just merging everything over, but once together it worked reasonably well

Got it working on testwiki

mysterious issues with failures when deployed live
Roan determined that the $_SESSION was losing records of certain files -- we seem to have race conditions
Changing the maxSimultaneousUploads config variable down to 1 fixed it, at the cost of slowing down uploads :(

4:26am calling this deployed, wrote email to the C-levels.

10:00am up and answering emails, checking new bug reports. Lots of dupes to issues we already know. One unusual issue with "permissiondenied".
I also saw this when trying to upload larger files around 5:30am. Why?

Reviewing bugs

usual tuesday features meeting, phoned in
came in at 1:00pm

my god, uploading things to commons is almost (ALMOST) not a pain now. Just makes me want to hurt my head rather than commit suicide. 
Lack of a bulk / batch interface is very apparent though, especially when uploading related stuffs.

The flakiness/bugs still make this feel very unstable though. Have to try several times to get 14 screenshots uploaded, properly categorized, described. It's not easy yet.

did not meet with Danese re: contract, yet. tomorrow?
Alolita says all are agreed I should continue to work on the bugfix release of UploadWizard primarily.