Enable CORS for getting page piles

#5 Merged at 8133818
Source repository: lucaswerkmeister (branch: cors)
Destination repository: magnusmanske (branch: master)
Author: Lucas Werkmeister
Reviewers
Description

This allows client-side gadgets and user scripts to access page piles.


Specific use-case: ACDC (see discussion).

Comments (3)

  1. Lucas Werkmeister (author)

    I just realized that this change alone is not enough to fulfill the use case we were discussing – it should be enough to make AC/DC consume page piles, but not to make other gadgets (VisualFileChange, Cat-a-lot, …) produce them. I’m not sure how to do that securely, to be honest.

    That said… given that pile creation currently doesn’t require a POST request, let alone any kind of CSRF token, I could already cause the creation of loads of page piles by putting something like

    span {
      background: url(https://tools.wmflabs.org/pagepile/api.php?action=create_pile_with_data&wiki=commonswiki&data=Main_Page)
    }
    

    into MediaWiki:Common.css (or any other website I control), I'm not sure whether this is something you're even concerned about. So perhaps it's enough to add an Access-Control-Allow-Origin: $origin header to any response where the request origin is a Wikimedia site? (I haven’t thought this through yet.)
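    That origin check could look roughly like the following — a Python sketch only, since the actual api.php is PHP, and the domain list and function name here are my own illustration, not anything in the patch:

    ```python
    from urllib.parse import urlparse

    # Illustrative allowlist of Wikimedia parent domains (deliberately incomplete).
    WIKIMEDIA_DOMAINS = (
        "wikipedia.org", "wikimedia.org", "wikidata.org",
        "mediawiki.org", "wmflabs.org",
    )

    def allowed_origin(origin):
        """Return the value to echo in Access-Control-Allow-Origin, or None."""
        host = urlparse(origin).hostname or ""
        for domain in WIKIMEDIA_DOMAINS:
            # Match the domain itself or any subdomain of it.
            if host == domain or host.endswith("." + domain):
                return origin
        return None
    ```

    A request from https://commons.wikimedia.org would get its own origin echoed back, while an arbitrary site would get no CORS header at all; echoing the origin rather than sending * also keeps the door open for credentialed requests later, where * is not allowed.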

  2. Lucas Werkmeister (author)

    Thanks for merging! But for some reason it’s not working…

    $ curl -I 'https://tools.wmflabs.org/pagepile/api.php?action=get_data&id=25421&format=json'
    HTTP/2 200 
    server: nginx/1.13.6
    date: Sat, 17 Aug 2019 18:18:01 GMT
    content-type: application/json; charset=utf-8
    set-cookie: pagepile=9avbqm5qh4muv4rmgfv30svli5; path=/pagepile
    expires: Thu, 19 Nov 1981 08:52:00 GMT
    cache-control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
    pragma: no-cache
    strict-transport-security: max-age=86400
    x-clacks-overhead: GNU Terry Pratchett
    content-security-policy-report-only: default-src 'self' 'unsafe-eval' 'unsafe-inline' blob: data: filesystem: mediastream: wikibooks.org *.wikibooks.org wikidata.org *.wikidata.org wikimedia.org *.wikimedia.org wikinews.org *.wikinews.org wikipedia.org *.wikipedia.org wikiquote.org *.wikiquote.org wikisource.org *.wikisource.org wikiversity.org *.wikiversity.org wikivoyage.org *.wikivoyage.org wiktionary.org *.wiktionary.org *.wmflabs.org wikimediafoundation.org mediawiki.org *.mediawiki.org wss://tools.wmflabs.org; report-uri https://tools.wmflabs.org/csp-report/collect;
    

    I can see that the code is deployed (grep Access-Control-Allow-Origin ~tools.pagepile/public_html/api.php), but the header isn't there. Any ideas why? Locally (under Apache, with some temporary hacks to work around missing require_once and get_request) it seems to work, and I can successfully set the header in Wikidata Lexeme Forms (curl -I https://tools.wmflabs.org/lexeme-forms/api/v1/template/english-noun), so I don’t think the front-end proxy (nginx, right?) strips it out. But perhaps lighttpd doesn’t allow it?
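
    If lighttpd does turn out to be dropping the header, one possible fallback would be to set it at the web-server level with mod_setenv — purely a sketch, since I haven’t looked at PagePile’s actual lighttpd configuration, and a static value like this obviously can’t do the per-origin reflection that the PHP code can:

    ```
    # Hypothetical lighttpd fallback (not PagePile's actual config):
    server.modules += ( "mod_setenv" )
    setenv.add-response-header = ( "Access-Control-Allow-Origin" => "*" )
    ```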