InstantCommons broken by switch to HTTPS
Open, Public

Description

InstantCommons has been broken by the switch to HTTPS-only mode; see e.g. beta enwiki. The fix is in master and needs to be backported to the supported branches. The HTTPS redirect has been temporarily disabled on Commons to give users time to upgrade.

Patch for people running older versions of MediaWiki: 8517b3

Alternatively, you can put this code snippet in your LocalSettings.php:

$wgUseInstantCommons = false; // disable the built-in InstantCommons shortcut...
// ...and register Wikimedia Commons manually as a foreign file repo,
// using the HTTPS API endpoint.
$wgForeignFileRepos[] = array(
	'class' => 'ForeignAPIRepo',
	'name' => 'wikimediacommons',
	'apibase' => 'https://commons.wikimedia.org/w/api.php',
	'hashLevels' => 2,
	'fetchDescription' => true,
	'descriptionCacheExpiry' => 43200, // 12 hours
	'apiThumbCacheExpiry' => 86400,    // 24 hours
);

If that does not help, the root certificate bundle of your server might be missing the certificate authority used by Wikimedia (GlobalSign), in which case it is probably badly outdated and you should update it. If you have shell access, you can check with this command (look at the "Server certificate" block):

curl -vso /dev/null 'https://upload.wikimedia.org/wikipedia/commons/7/70/Example.png' && echo success || echo failed
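
If curl succeeds but InstantCommons still fails, it can help to test the same kind of fetch from PHP itself, since that is what MediaWiki ultimately relies on. A minimal sketch, assuming allow_url_fopen and the openssl extension are enabled (the URL is the Commons API endpoint from the snippet above):

<?php
// Prints "success" if PHP can fetch the Commons API over HTTPS, "failed" otherwise.
// Note: only PHP 5.6+ verifies the certificate chain by default for this request type.
$url = 'https://commons.wikimedia.org/w/api.php?action=query&meta=siteinfo&format=json';
echo ( @file_get_contents( $url ) === false ) ? "failed\n" : "success\n";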

gerritbot added a comment. Via Conduit · Jun 17 2015, 6:45 PM

Change 218819 merged by jenkins-bot:
Make $wgUploadPath for commons https only for benefit instant commons

https://gerrit.wikimedia.org/r/218819

TheDJ added a subscriber: TheDJ. Via Web · Jun 17 2015, 10:32 PM
gerritbot added a comment. Via Conduit · Jun 17 2015, 11:24 PM

Change 219094 had a related patch set uploaded (by BBlack):
HTTPS: remove MW-UA redirect exception for upload.wm.o

https://gerrit.wikimedia.org/r/219094

gerritbot added a comment. Via Conduit · Jun 18 2015, 1:19 AM

Change 219094 merged by BBlack:
HTTPS: remove MW-UA redirect exception for upload.wm.o

https://gerrit.wikimedia.org/r/219094

Tgr added a comment. Via Web · Jun 19 2015, 1:53 AM

The HTTPS redirect for upload.wikimedia.org breaks images for some people who have old root certs; I just got a report on IRC. (Strangely, disabling the thumbnail cache did not fix this.) This is made worse by the lack of logging (T103043).

BBlack added a comment. Via Web · Jun 19 2015, 3:20 AM

Yeah, I don't know that we can do anything about the "old certs" issue. It probably means that whatever underlying SSL cert store/client library the client uses is severely out of date. I think we had some talk on this earlier (maybe just on IRC) that it mostly affected some older Windows installations. In any case, the root cert we use is compatible across a broad range of browsers and SSL library/cert installations in general, even back to things like IE8 on WinXP (technically IE6 even in cert terms, but that's broken due to SSLv3). I think it's fair to blame the client end and ask them to fix/upgrade if they don't have the GlobalSign root at this point.

Bawolff added a comment. Via Web · Jun 19 2015, 6:09 AM

PHP on Windows is probably a likely offender.

Anyway, PHP has a config option to point to a root CA store (assuming you are using libcurl, which we normally do if it's available), so it's fairly easy to just tell people to download the new root CA store and/or upgrade PHP.
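
For reference, a quick way to see which CA store PHP would use is the sketch below (a diagnostic only; openssl_get_cert_locations() needs PHP 5.6+, and empty or false values simply mean the library defaults apply):

<?php
// Where php.ini points cURL and the openssl extension for CA certificates.
var_dump( ini_get( 'curl.cainfo' ) );
var_dump( ini_get( 'openssl.cafile' ) );
// Compiled-in OpenSSL default locations (available in PHP 5.6+ only).
if ( function_exists( 'openssl_get_cert_locations' ) ) {
	var_dump( openssl_get_cert_locations() );
}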

Bawolff added a comment. Via Web · Jun 19 2015, 6:33 AM

Alternatively, we could bundle https://raw.githubusercontent.com/bagder/ca-bundle/master/ca-bundle.crt (or even just GlobalSign) and set the caInfo option of our HTTP wrapper ourselves.
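
At the cURL level, the proposal would amount to something like the sketch below (the bundle path is hypothetical, and CURLOPT_CAINFO is, as far as I understand, what a caInfo-style option would translate to):

<?php
// Fetch the Commons API while validating against a bundled CA file instead of
// the system store.
$ch = curl_init( 'https://commons.wikimedia.org/w/api.php' );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_CAINFO, '/path/to/ca-bundle.crt' ); // hypothetical path
$ok = ( curl_exec( $ch ) !== false );
curl_close( $ch );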

faidon added a comment. Via Web · Jun 19 2015, 2:01 PM

Alternatively, we could bundle https://raw.githubusercontent.com/bagder/ca-bundle/master/ca-bundle.crt (or even just GlobalSign) and set the caInfo option of our HTTP wrapper ourselves.

No, don't do that. Maintaining a CA store is serious business and we don't want to be in that business (even in the role of republishing someone else's CA store). Just tell people to trust their OS' CA store, anything else is just insecure.

Tgr added a comment. Via Web · Jun 19 2015, 7:17 PM

Just tell people to trust their OS' CA store, anything else is just insecure.

Telling people to trust their built-in CA store when it does not work with Commons is not exactly helpful. I agree we shouldn't mess with the root CA list by default, but we need a drop-in fix for people who are on old/weird OSes and want to get InstantCommons working without taking a course in system administration. Manually changing the root CAs of your OS can be quite difficult.

faidon added a comment. Via Web · Jun 19 2015, 10:05 PM

Our CA for production is GlobalSign. It is one of the biggest (in terms of websites using it) and oldest CAs. The top-most root certificate of our chain was issued in 1998 and has been in operating systems and browsers for at least 15 years now. GlobalSign has a datasheet detailing their compatibility (ignore the first "Extended Validation" section; we don't use that). The compatibility includes browsers such as MSIE 5.01+, all versions ever released of Firefox, Google Chrome and Safari (not to mention Netscape Communicator 4.51+!), plus all versions of mobile phone browsers ever since Windows CE 4.0. I would very much like to hear in which CA stores this is actually a problem and for how many users.

Moreover, a CA store that is so old probably has at least a few compromised CAs (e.g. DigiNotar), and those users should definitely take a course in system administration or consult their sysadmin.

Finally, these are not even end users; these are users *running MediaWiki*. To draw a comparison, the CA has been in operating systems' CA stores since the time PHP 4 was new and all the rage :) Surely we have to draw the line somewhere for the oldest base system we support running MediaWiki on, and that line is certainly not a system running PHP 4. Right? :)

Tgr edited the task description. Via Web · Jun 19 2015, 11:22 PM

@faidon: fair enough (although https://www.ssllabs.com/ssltest/analyze.html?d=en.wikipedia.com claims IE6 is not compatible).

BBlack added a comment. Via Web · Jun 19 2015, 11:32 PM

@Tgr that's because we don't speak the SSLv3 protocol anymore, because of the POODLE attack in late 2014. If it weren't for that, IE6 would be compatible on a certificate level.

hashar added a comment. Via Web · Jun 20 2015, 9:03 AM

[snip] we need a drop-in fix for people who are on old/weird OSes and want to get InstantCommons working without taking a course in system administration. Manually changing the root CAs of your OS can be quite difficult.

No, we should not bother trying to keep MediaWiki running on platforms that are totally outdated. If one wants to benefit from InstantCommons or interact with our sites over SSL, they will have to upgrade. Period.

As for IE6, it is hopeless anyway, and Microsoft campaigned heavily to phase it out (e.g. http://browseryoulovedtohate.com/).

Nemo_bis added a subscriber: Nemo_bis. Via Web · Jun 21 2015, 6:11 AM
Nemo_bis added a comment. Via Web · Jun 21 2015, 7:24 AM

Surely we have to draw the line somewhere for the oldest base system we support running MediaWiki on, and that line is certainly not a system running PHP 4. Right? :)

I wouldn't take that for granted. At the very least, WikiApiary statistics should be checked; there are probably thousands of wikis still running on PHP 4.

Anyway, I just wanted to say: whatever the conclusion is on the supported way to use InstantCommons, please remember to update https://www.mediawiki.org/wiki/InstantCommons well before that way is made mandatory (by breaking the other ways).

Liuxinyu970226 added a subscriber: Liuxinyu970226. Via Web · Jun 22 2015, 11:24 AM
demon added a comment. Via Web · Jun 22 2015, 2:19 PM

Surely we have to draw the line somewhere for the oldest base system we support running MediaWiki on, and that line is certainly not a system running PHP 4. Right? :)

I wouldn't take that for granted. At the very least, WikiApiary statistics should be checked; there are probably thousands of wikis still running on PHP 4.

Lucky for us, we dropped PHP 4 years ago, prior to InstantCommons even existing.

Nemo_bis added a comment. Via Web · Jun 24 2015, 8:30 PM

1.6.12 was in 2009, not that long ago. ;)

demon added a comment. Via Web · Jun 24 2015, 9:13 PM

That's 6 years ago, yeah pretty long.

Also, it never supported InstantCommons, so who cares about PHP4 at all?

Legoktm added subscribers: Krinkle, Legoktm. Via Web · Jun 25 2015, 6:36 AM

Our CA for production is GlobalSign. It is one of the biggest (in terms of websites using it) and oldest CAs. The top-most root certificate of our chain was issued in 1998 and has been in operating systems and browsers for at least 15 years now. GlobalSign has a datasheet detailing their compatibility (ignore the first "Extended Validation" section; we don't use that). The compatibility includes browsers such as MSIE 5.01+, all versions ever released of Firefox, Google Chrome and Safari (not to mention Netscape Communicator 4.51+!), plus all versions of mobile phone browsers ever since Windows CE 4.0. I would very much like to hear in which CA stores this is actually a problem and for how many users.

Today @Krinkle and I discovered that using homebrew's python on OSX does not have the right certs installed, resulting in:

Python 2.7.10 (default, May 26 2015, 13:01:57)
[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import urllib2
>>> urllib2.urlopen('https://commons.wikimedia.org/w/api.php')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 431, in open
    response = self._open(req, data)
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 449, in _open
    '_open', req)
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1240, in https_open
    context=self._context)
  File "/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 1197, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)>

I found https://github.com/Homebrew/homebrew/issues/39870 which seemed to explain what's going on.

hashar added a comment. Via Web · Jun 25 2015, 12:31 PM

Today @Krinkle and I discovered that using homebrew's python on OSX does not have the right certs installed, resulting in:
...
I found https://github.com/Homebrew/homebrew/issues/39870 which seemed to explain what's going on.

That works for me. You might want to reinstall the brew versions of openssl and python.

Krinkle added a comment. Via Web · Jun 25 2015, 5:10 PM

Today @Krinkle and I discovered that using homebrew's python on OSX does not have the right certs installed, resulting in:
...
I found https://github.com/Homebrew/homebrew/issues/39870 which seemed to explain what's going on.

That works for me. You might want to reinstall the brew versions of openssl and python.

Already did that. Fresh install from Homebrew of python (2.7.10), and openssl (openssl 1.0.2c). The error remains.

Most likely this is because there are some outdated root certificates OS X has lying around. Python v2.7.9 introduced stricter checking of certificates. When using the default Mac python (2.7.6) it works fine, because it doesn't do the strict checking. However with the old Python, installing JJB doesn't work well due to conflicting base packages. It works in a virtualenv though, so I did that for now.

Bawolff added a comment. Via Web · Jun 28 2015, 2:27 AM

I was recently playing around with Amazon's cloud stuff and tried installing MediaWiki. Much to my surprise, InstantCommons was broken.

Apparently, the default PHP you get from apt-get install php5 (5.5.9-1ubuntu4.9) does not have the necessary certs (or something else was wrong with the fopen wrappers for HTTPS, but the openssl extension was installed, so I think certs are the most likely issue). Anyway, the main point is that it didn't work out of the box. I didn't investigate for very long before installing php5-curl, which fixed things. But we want InstantCommons to just work even if the administrator does not know what they are doing.

Finally, these are not even end users; these are users *running MediaWiki*. To draw a comparison, the CA has been in operating systems' CA stores since the time PHP 4 was new and all the rage :) Surely we have to draw the line somewhere for the oldest base system we support running MediaWiki on, and that line is certainly not a system running PHP 4. Right? :)

A large portion of our install base are not sophisticated sysadmins. Given the support questions that get asked on IRC, it's often lucky if the user even knows what the phrase "ssh into the server and do X" means. I'm not worried about old versions; I'm worried about versions that are commonly in use right now.

You make fair points about distributing ssl certs being a bad idea, though.
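
For admins hitting the same out-of-the-box failure, a quick sanity check is sketched below (an illustration only, run from the command line on the wiki host): InstantCommons over HTTPS effectively needs either the curl extension or a PHP/OpenSSL setup with a working HTTPS stream wrapper and CA store.

<?php
// Report whether the pieces MediaWiki's HTTP layer can use are present.
echo extension_loaded( 'curl' )
	? "curl extension present\n" : "curl extension missing\n";
echo in_array( 'https', stream_get_wrappers() )
	? "https stream wrapper present\n" : "https stream wrapper missing\n";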

Restricted Application added a subscriber: Steinsplitter. Via Herald · Jun 28 2015, 2:27 AM
Steinsplitter added a project: Commons. Via Web · Jun 28 2015, 6:23 AM
Steinsplitter moved this task to Backlog on the Commons workboard.
Tau added a subscriber: Tau. Via Web · Jun 28 2015, 12:21 PM

Hello everyone! This is my first post here. I am an administrator of one wiki project. I had set InstantCommons to true in LocalSettings.php and my wiki was showing Commons files correctly. Since this HTTP-to-HTTPS change, InstantCommons is not working on my wiki anymore; only red links are shown where images should be. I have searched for solutions and tried to solve this problem, but so far without success. I have read this topic several times, but mostly I can't understand what you are talking about. I have Ubuntu 14.04.2, PHP 5.5.9 and MediaWiki 1.23.3, with SSH access via the command line and FileZilla.

Is the problem with outdated certificates? Anyway, can someone explain how to solve this issue and get InstantCommons working again in a way that ordinary people can also understand?

Reedy added a subscriber: Reedy. Via Web · Jun 28 2015, 12:40 PM

@Bawolff was talking about this last night, see his comment above.

Can you install the php5-curl package, restart your webserver and see how things get on?

Tau added a comment. Via Web · Jun 28 2015, 12:46 PM

Okay, I will try and let you know.

Seb35 added a subscriber: Seb35. Via Web · Jul 1 2015, 12:22 AM

With HTTPS now mandatory for InstantCommons, the php5-curl package (the curl PHP extension) effectively becomes mandatory for using InstantCommons as well.

The fallback backend, PhpHttpRequest, suffers from two issues related to HTTPS: T75203 (a certificate repository must be set) and T75199 (the wildcard certificate must be handled).
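
Until those are fixed, one possible stopgap (a sketch that assumes Http::$httpEngine is still the switch MediaWiki of this era uses for choosing its HTTP backend, and that php5-curl is installed) is to force the cURL backend from LocalSettings.php so the PhpHttpRequest fallback is never used:

// Assumption: Http::$httpEngine selects the HTTP backend ('curl' or 'php').
Http::$httpEngine = 'curl';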

Tau added a comment. Via Web · Jul 1 2015, 12:48 PM

I have installed php5-curl now, but InstantCommons still isn't working properly. What next?

Bawolff added a comment. Via Web · Jul 1 2015, 10:30 PM

I have installed php5-curl now, but InstantCommons still isn't working properly. What next?

Most likely, follow the instructions at https://snippets.webaware.com.au/howto/stop-turning-off-curlopt_ssl_verifypeer-and-fix-your-php-config/ to set curl.cainfo in php.ini.
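
After changing php.ini it is worth confirming that the setting was actually picked up by the web server, not just the CLI (Apache and the CLI read separate php.ini files). A one-line check, as a sketch to run through the web server:

<?php
// An empty string or false here means curl.cainfo is still unset for this SAPI.
var_dump( ini_get( 'curl.cainfo' ) );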

Tau added a comment. Via Web · Jul 7 2015, 2:53 PM

I downloaded the two php.ini files via FileZilla from the directories /etc/php5/apache2/ and /etc/php5/cli/.

I changed ;curl.cainfo = to curl.cainfo = "/etc/ssl/certs/ca-certificates.crt" in both files, uploaded them to the server, and replaced the original files with the new ones.

Then I restarted the server with sudo apache2ctl graceful and sudo service apache2 restart.

Still nothing ... Any further recommendations??

Sumit added a subscriber: Sumit. Via Web · Jul 8 2015, 5:44 PM
Tgr added a comment. Via Web · Jul 8 2015, 11:08 PM

Still nothing ... Any further recommendations??

Make sure you have logging enabled, then grep the log for ForeignAPIRepo: ERROR on GET:.
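
A minimal way to get such a log, as a sketch (the path is only an example and must be writable by the web server; the same setting comes up again later in this thread):

// In LocalSettings.php: write MediaWiki's debug log to a file, then search it:
//   grep 'ForeignAPIRepo: ERROR on GET:' /var/log/mediawiki/debug.log
$wgDebugLogFile = '/var/log/mediawiki/debug.log';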

BBlack added a comment. Via Web · Jul 13 2015, 4:26 PM

So, where are we at on removing the redirection workarounds here? I'd still like to get these removed ASAP. Have we released new software with https:// URLs? Does MW's http-fetching stuff in general always allow protocol-only redirects (this may affect internal cases of fetches from meta, too).

Tgr added a comment. Via Web · Jul 13 2015, 6:35 PM

Have we released new software with https:// URLs?

No. The InstantCommons patch still needs to be merged to 1.24 and 1.23, and all supported versions (1.23, 1.24, 1.25) need a new release. And presumably some grace period after that.

The blocking tasks also need some work.

Does MW's http-fetching stuff in general always allow protocol-only redirects (this may affect internal cases of fetches from meta, too).

It does not follow redirects at all unless explicitly configured to. If redirects are enabled, there is no way to restrict them.
See also T103043 (not sure if it should be a blocker).
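
For context, a sketch of what "explicitly configured" looks like for a caller (the followRedirects option name is how MediaWiki's HTTP layer of this era exposes it, to the best of my knowledge; the URL is the one from the task description):

// Redirects are opt-in per request, and there is no "protocol-only" restriction.
$req = MWHttpRequest::factory(
	'http://upload.wikimedia.org/wikipedia/commons/7/70/Example.png',
	array( 'followRedirects' => true )
);
$status = $req->execute();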

demon added a comment. Via Web · Jul 13 2015, 6:47 PM

So, where are we at on removing the redirection workarounds here?

Patches have been merged to all supported branches.

I'd still like to get these removed ASAP. Have we released new software with https:// URLs?

No, no release yet. 1.25.2 (T93267) is supposed to be a security release but it's not quite ready yet. If this is super urgent, I suppose we could do a release and just push back the security one to 1.25.3...

Does MW's http-fetching stuff in general always allow protocol-only redirects (this may affect internal cases of fetches from meta, too).

We don't follow redirects by default in our http-fetching stuff. It'd be up to an explicit codepath to turn that on. When it is on, it doesn't make any sort of assumptions about protocols, etc. We could probably improve things there...

BBlack added a comment. Via Web · Jul 14 2015, 12:00 AM

So, where are we at on removing the redirection workarounds here?

Patches have been merged to all supported branches.

I'd still like to get these removed ASAP. Have we released new software with https:// URLs?

No, no release yet. 1.25.2 (T93267) is supposed to be a security release but it's not quite ready yet. If this is super urgent, I suppose we could do a release and just push back the security one to 1.25.3...

Just to clear up confusion with @Tgr's comment as well: are we planning to release 1.23 and 1.24 as well, or just 1.25?

This whole thing doesn't really fit my definition of "Super Urgent", but on the other hand it's now been about a month, and I was expecting more like 2-3 weeks before pulling the exception out of Varnish, with no real end in sight yet. If the plan is to take another month or two, then yeah, we need to do something about fixing this sooner. As far as I'm concerned, the cause of this is our own broken software. We've been harder on external breakage than we're being with ourselves here...

Does MW's http-fetching stuff in general always allow protocol-only redirects (this may affect internal cases of fetches from meta, too).

We don't follow redirects by default in our http-fetching stuff. It'd be up to an explicit codepath to turn that on. When it is on, it doesn't make any sort of assumptions about protocols, etc. We could probably improve things there...

Perhaps this part needs to be a separate task, but my feeling is that we'll continue to see breakage internally and externally so long as we don't fix the general case here. Anything that acts as HTTP[S]-fetching code in MediaWiki should always follow a protocol redirect (as in, nothing about the URL changes except the protocol switch from http to https), regardless of any sort of security-focused "don't follow redirects" flag.
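
A sketch of the protocol-only test described above (a hypothetical helper for illustration, not existing MediaWiki code): a redirect would be followed unconditionally only when the target differs from the source solely by the http-to-https scheme change.

<?php
// True only if $to is exactly $from with the scheme upgraded from http to https.
function isProtocolOnlyUpgrade( $from, $to ) {
	return strpos( $from, 'http://' ) === 0
		&& 'https://' . substr( $from, strlen( 'http://' ) ) === $to;
}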

demon added a comment. Via Web · Jul 14 2015

Just to clear up confusion with @Tgr's comment as well: are we planning to release 1.23 and 1.24 as well, or just 1.25?

I figured all 3 since patches were made for all of them...but I don't really care tbh

This whole thing doesn't really fit my definition of "Super Urgent", but on the other hand it's now been about a month, and I was expecting more like 2-3 weeks before pulling the exception out of Varnish, with no real end in sight yet.

Sorry if we set a bad expectation here...releases never happen on time :)

If the plan is to take another month or two, then yeah, we need to do something about fixing this sooner. As far as I'm concerned, the cause of this is our own broken software. We've been harder on external breakage than we're being with ourselves here...

To be honest, even if I released the software tomorrow, we'll still see a long tail of people upgrading. You're still looking at months...

If we don't really care, then why not just remove the exceptions today? InstantCommons is an optional feature, off by default, and the broken behavior can be configured around.

Perhaps this part needs to be a separate task, but my feeling is that we'll continue to see breakage internally and externally so long as we don't fix the general case here. Anything that acts as HTTP[S]-fetching code in MediaWiki should always follow a protocol redirect (as in, nothing about the URL changes except the protocol switch from http to https), regardless of any sort of security-focused "don't follow redirects" flag.

Yeah. The whole behavior around redirection here is wonky. We should definitely fix it up. Filed T105765 for it.

gerritbot added a comment. Via Conduit · Jul 14 2015, 1:56 AM

Change 224557 had a related patch set uploaded (by BBlack):
HTTPS redirects: remove InstantCommons exception

https://gerrit.wikimedia.org/r/224557

BBlack added a comment. Via Web · Jul 14 2015, 1:58 AM

Change prepped so it's easy. I'm open to debate on timing (I tend to think we should at least have a software release available).

MZMcBride added a subscriber: MZMcBride. Via Web · Jul 14 2015, 4:43 AM
Nemo_bis added a comment. Via Web · Jul 14 2015, 5:18 AM

I tend to think we should at least have a software release available

The necessary steps must also be documented. At least one of https://www.mediawiki.org/wiki/InstantCommons, https://www.mediawiki.org/wiki/Manual:$wgInstantCommons and https://www.mediawiki.org/wiki/Manual:$wgForeignFileRepos needs an update.

Did Tau find the solution to their problem?

Tgr added a comment. Via Web · Jul 16 2015, 1:45 AM

I don't see anything needing change on any of those.

Tgr added a comment. Via Web · Jul 16 2015, 1:48 AM

I don't see anything needing change on any of those.

Probably because Bawolff fixed them three weeks ago.

Tau added a comment. Via Web · Jul 17 2015, 7:56 AM

Still nothing ... Any further recommendations??

Make sure you have logging enabled, then grep the log for ForeignAPIRepo: ERROR on GET:.

Did Tau find the solution to their problem?

I created a mediawiki folder in /var/log/ (permissions rwx for both user and group). Then I added the line $wgDebugLogFile = "/var/log/mediawiki/debug-{$wgDBname}.log"; to LocalSettings.php. I made a few edits on a wiki page, but the debug file was not created. Then I manually added the text file debug-my_wiki.log (permissions rwx for both user and group) to /var/log/mediawiki/, made edits on a wiki page, and restarted the server, but I still can't get the debug info saved into that file. What am I doing wrong?

Tgr added a comment. Via Web · Jul 17 2015, 9:38 AM

Manual:$wgDebugLogFile recommends checking open_basedir. Also, are you sure the user MediaWiki runs under (probably www-data) is in the right group to write the file?

Tau added a comment. Via Web · Jul 17 2015, 11:29 AM

In both php.ini files, open_basedir is commented out (";open_basedir =", i.e. blank). Is that okay? Is this setting defined anywhere else in addition to these two php.ini files?

I typed "groups www-data" on the command line and got "www-data : www-data" - is that okay?

Tgr added a comment. Via Web · Jul 17 2015, 4:14 PM

In both php.ini files, open_basedir is commented out (";open_basedir =", i.e. blank). Is that okay? Is this setting defined anywhere else in addition to these two php.ini files?

If you are running your own webserver (not some cheap shared host) and you didn't set it explicitly, it's not enabled.

I typed "groups www-data" on the command line and got "www-data : www-data" - is that okay?

That means for a file to be writable, one of these must be true:

  • the owner of the file is www-data
  • the group of the file is www-data
  • the file is world-writable (i.e. rw-rw-rw- or something like that)
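
A quick way to check this from PHP's side, using the log directory that comes up later in this thread (a sketch only; run it as the web server user):

<?php
// true means the web server user can create the debug log file here.
var_dump( is_writable( '/var/log/mediawiki' ) );
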
Tau added a comment. Via Web · Jul 17 2015, 5:08 PM

Should open_basedir be enabled then? What directory should I set open_basedir to?

If I change the owner of the /var/log/mediawiki folder to www-data, will it then have permission to write this file?

Or what exactly should I do?

Tgr added a comment. Via Web · Jul 17 2015, 6:50 PM

If it is not a large production website, I would just do chmod -R a+rw /var/log/mediawiki.

Tau added a comment. Via Web · Jul 25 2015, 9:51 AM

Finally I managed to get logging enabled.

  1. I did chmod -R a+rw /var/log/mediawiki - still nothing
  2. Created the log file /var/log/mediawiki/debug-mywiki.log manually - still nothing
  3. sudo chown www-data:www-data /var/log/mediawiki - the manually created log file disappeared
  4. sudo chown www-data:www-data /var/log/mediawiki/debug-mywiki.log - still the same situation
  5. sudo chown admin:admin /var/log/mediawiki - the manually created log file appeared again, but instead of 0 bytes its size was now 50,982 bytes, so the log was finally being written.

BUT no such phrase as ForeignAPIRepo appears in it. How do I get it?

Nemo_bis added a comment. Via Web · Jul 25 2015, 12:24 PM

BUT no such phrase as ForeignAPIRepo appears in it. How do I get it?

Have you tried loading and purging some pages with remote images, to force thumbnail creation?

Tau added a comment. Via Web · Jul 26 2015, 3:40 PM

I have tried purging but with no success. Can turning ImageMagick on/off affect this issue? I will try some maintenance scripts next week.

Tau added a comment. Via Web · Aug 2 2015, 10:10 AM

I tried several maintenance scripts (purgeList, checkImages, rebuildImages etc.) but none of them helped. It's getting quite annoying already...

Tgr added a comment. Via Web · Aug 2 2015, 11:06 PM

You could try to cherry-pick https://gerrit.wikimedia.org/r/#/c/223518/ and set $wgDebugLogGroups['http'] = <some custom log file>.
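
In LocalSettings.php that would look something like the following (the log path is only an example and must be writable by the web server, as discussed above):

// Send MediaWiki's 'http' debug log group to its own file.
$wgDebugLogGroups['http'] = '/var/log/mediawiki/http.log';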

Tau added a comment. Via Web · Aug 3 2015, 6:38 PM
  1. Do I run it via the command line? git fetch https://gerrit.wikimedia.org/r/mediawiki/core refs/changes/18/223518/4 && git cherry-pick FETCH_HEAD
  2. In which directory should I run it?
  3. Does this $wgDebugLogGroups['http'] = <some custom log file> go into LocalSettings.php?
Tgr added a comment. Via Web · Aug 3 2015, 7:28 PM

The easiest way is probably to run curl -s 'https://git.wikimedia.org/patch/mediawiki%2Fcore/64446397925576210c50baedc77becb470df84e2' | patch -p1 in the MediaWiki main directory (what you wrote would work too if you installed MediaWiki via git). $wgDebugLogGroups should go into LocalSettings.php.

Tau added a comment. Via Web · Aug 5 2015, 9:18 AM

[Attachment: error.bmp]

I have some problems running this patch...

greg removed a subscriber: greg. Via Web · Aug 5 2015, 4:56 PM
Tgr added a comment. Via Web · Aug 5 2015, 6:31 PM

It sounds like you already had that commit locally. How did you install MediaWiki, tarballs or a git checkout?

Tau added a comment. Via Web · Aug 6 2015, 9:47 AM

MediaWiki was installed by extracting the tar file, following the instructions found here: https://www.howtoforge.com/how-to-install-mediawiki-on-ubuntu-14.04

Tgr added a comment. Via Web · Aug 6 2015, 7:24 PM

Apparently git.wikimedia.org patch pages are HTML, not plaintext. How fun.

So here's a command that works:

curl -s 'https://github.com/wikimedia/mediawiki/commit/64446397925576210c50baedc77becb470df84e2.patch' | patch -p1

You might have to restore the files includes/HttpFunctions.php and includes/filerepo/ForeignAPIRepo.php to their original version first.

Tau added a comment. Via Web · Aug 7 2015, 8:45 AM

What is the easiest way to restore the files includes/HttpFunctions.php and includes/filerepo/ForeignAPIRepo.php to their original versions if I don't have a backup of them?

Tgr added a comment. Via Web · Mon, Aug 10, 3:01 AM

If you followed the instructions in https://www.howtoforge.com/how-to-install-mediawiki-on-ubuntu-14.04 exactly (i.e. you are using MW 1.23.3), you can just copy them from HttpFunctions.php and ForeignAPIRepo.php.

Tau added a comment. Via Web · Wed, Aug 12, 6:23 AM

I saved HttpFunctions.php and ForeignAPIRepo.php as text files on my computer, then copied the content from these files and replaced the content of the server files.

I still have trouble with this patch:

/var/www/html/mediawiki$ curl -s 'https://github.com/wikimedia/mediawiki/commit/64446397925576210c50baedc77becb470df84e2.patch' | patch -p1
patching file includes/HttpFunctions.php
Hunk #2 FAILED at 75.
1 out of 2 hunks FAILED -- saving rejects to file includes/HttpFunctions.php.rej
patching file includes/filerepo/ForeignAPIRepo.php
Hunk #2 FAILED at 523.
1 out of 2 hunks FAILED -- saving rejects to file includes/filerepo/ForeignAPIRepo.php.rej
demon removed a subscriber: demon. Via Web · Wed, Aug 12, 1:50 PM
Tgr added a comment. Via Web · Fri, Aug 14, 9:57 PM

I'll just upload the correct files then: ForeignAPIRepo.php

HttpFunctions.php

BBlack added a comment. Via Web · Mon, Aug 17, 9:27 PM

So, I see new releases from a week ago for 1.2[345] containing the InstantCommons fix. Also, it's been about a month since the last time I complained about this issue. I'm leaning towards merging https://gerrit.wikimedia.org/r/#/c/224557/ sometime this week, unless anyone else has objections.

Tau added a comment. Via Web · Tue, Aug 18, 9:13 AM

I'll just upload the correct files then: ForeignAPIRepo.php

HttpFunctions.php

I replaced the old files with these files, but the log file is still blank (I tried changing user permissions, owners, etc.).

Can this issue be fixed if I upgrade from 1.23.3 to 1.23.10?

Tgr added a comment. Via Web · Wed, Aug 19, 10:22 PM

Can this issue be fixed if I upgrade from 1.23.3 to 1.23.10?

Possibly. You might be using an old branch where certain logging changes were not backported yet.

@BBlack: there are two blocking tasks with unmerged patches, and they fix pretty serious issues with fopen. Do we know how many people use fopen vs. curl?

BBlack added a comment. Via Web · Wed, Aug 19, 10:46 PM

@Tgr: do we have any immediate plans to fix those anyway, or a sane plan to fix them that would apply to the bulk of users?

BBlack added a comment. Via Web · Wed, Aug 19, 10:49 PM

Answering my own question: there are mw/core patches attached to both, with the last activity about a month ago and comments indicating that they seem to test well. Is there some unknown blocking them?

Tgr added a comment. Via Web · Wed, Aug 19, 11:08 PM

Nothing apart from lack of reviewers, I think. I can review them in the next couple days, but that only makes sense if you are willing to wait for the next tarball so they can go out. It looks like fopen would be broken badly without these patches, but I have no idea if that is a big deal or not. Are there environments where curl is not available? (Shared hosting?) Do we maybe log user agents so we can tell whether PHP fopen is used frequently?

Tgr added a comment. Via Web · Wed, Aug 19, 11:15 PM

To answer myself, we log them in api-feature-usage but I have no idea how to tell curl vs. fopen from that. Curl uses MediaWiki/<version> and fopen uses whatever is configured in php.ini.

Nemo_bis added a comment. Via Web · Thu, Aug 20, 9:42 AM

Are there environments where curl is not available? (Shared hosting?)

Yes. (According to users.)

jayvdb added a subscriber: jayvdb. Via Web · Sat, Aug 22, 3:10 AM
BBlack added a comment. Via Web · Wed, Aug 26, 12:33 PM

Pinged both of those tasks. I'm pretty much out of patience with waiting for PHP to suddenly become a less-horrible platform for making requests over the Internet.

Seb35 added a comment. Via Web · Thu, Aug 27, 9:52 AM

Both tasks have patches written by @Bawolff and tested by me, waiting for reviewers and +2.

@BBlack: the good news is that PHP 5.6 correctly speaks HTTPS without curl (SAN and CA repo), so the nightmare is ending :-/

Jdforrester-WMF moved this task to Next up on the Multimedia workboard. Via Web · Fri, Sep 4, 6:43 PM
