Software developer on the Wikidata team at Wikimedia Germany (he/him, Berlin timezone). Private account: @LucasWerkmeister.
User Details
- User Since: Apr 3 2017, 2:45 PM (398 w, 5 d)
- Availability: Available
- IRC Nick: Lucas_WMDE
- LDAP User: Lucas Werkmeister (WMDE)
- MediaWiki User: Lucas Werkmeister (WMDE)
Yesterday
Well, the following works, though I don’t like it very much:
I’m confused by this test, actually. It’s called “ignore U+110000 and above”, but that doesn’t seem to match the test string very well. The hexadecimal input corresponds to:
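For reference, U+10FFFF is the highest valid Unicode code point, so "U+110000 and above" means "past the end of Unicode". A small Python illustration of that boundary (not the test code itself):

```python
# U+10FFFF is the maximum Unicode code point; anything above is invalid.
assert len(chr(0x10FFFF)) == 1  # valid: the last code point in Unicode

try:
    chr(0x110000)  # one past the maximum
except ValueError as e:
    print("rejected:", e)  # Python refuses to construct it
```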
The list in T374000#10339961 matches a codesearch pretty well (except that a handful of extra repos are included in codesearch). I think we can consider this task reviewed.
Apparently an older version of the guide had more details on how to generate analyzed-out.gz, though it’s just a guess that the cronjob was an automated version of the steps listed there (maybe Marius or Amir can confirm?).
@dcausse following up from our discussion in T376250 – now that we reuse the fields from WikibaseCirrusSearch (both for the definitions and for the actual field data), should we also try to reuse as much as possible of EntitySearchElastic for our own EntitySearchHelper implementation? (Currently the WIP patch attached above copy+pastes a lot of that code instead.)
The EntitySchema type URI is http://wikiba.se/ontology#EntitySchema and referenced in Wikidata RDF (example)
Should be deployed with next week’s train (T375664), i.e. reach Wikidata (group1) on 27 November or shortly thereafter.
Thu, Nov 21
Thanks for the example! I agree that entity-type-specific dedicated messages sound like a good solution; we can use a similar approach as in AutoCommentFormatter::getSummaryMessage() to fall back to the existing generic messages. (If the specific message only exists in English, it would still be used even if the generic message is available in the user language, but I think that’s just the generally accepted way that MediaWiki messages work.)
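The fallback described above could be sketched roughly like this (the message keys and the dict-based lookup are purely illustrative; the real code would go through MediaWiki's message system, as AutoCommentFormatter::getSummaryMessage() does):

```python
# Hypothetical sketch of entity-type-specific message fallback.
# Keys and lookup are made up for illustration, not real MediaWiki API.
MESSAGES = {
    "summary-generic": "changed an entity",
    "summary-item": "changed an Item",  # specific message for one entity type
}

def get_summary_message(entity_type: str) -> str:
    specific = f"summary-{entity_type}"
    # Prefer the entity-type-specific message if it exists,
    # otherwise fall back to the existing generic message.
    if specific in MESSAGES:
        return MESSAGES[specific]
    return MESSAGES["summary-generic"]

print(get_summary_message("item"))      # uses the specific message
print(get_summary_message("property"))  # falls back to the generic one
```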
Prio Notes:
I would really like for the database to fail loudly when one tries to insert a string that is too long
I’d still be interested in opinions on this :)
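For context, MySQL's behaviour here depends on the SQL mode: with STRICT_TRANS_TABLES an over-long string is rejected, otherwise it is silently truncated. A minimal sketch of the "fail loudly" behaviour, using SQLite purely for illustration (SQLite doesn't enforce VARCHAR lengths, so the check is done in application code; table and limit are made up):

```python
import sqlite3

MAX_LEN = 10  # hypothetical column limit

def insert_name(conn: sqlite3.Connection, name: str) -> None:
    if len(name) > MAX_LEN:
        # Fail loudly instead of silently truncating
        # (MySQL does this natively with sql_mode=STRICT_TRANS_TABLES).
        raise ValueError(f"string too long ({len(name)} > {MAX_LEN}): {name!r}")
    conn.execute("INSERT INTO t (name) VALUES (?)", (name,))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name VARCHAR(10))")
insert_name(conn, "short")  # fits, inserted normally
try:
    insert_name(conn, "definitely-much-too-long")
except ValueError as e:
    print("rejected:", e)
```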
For easier readability:
Mon, Nov 18
I see, thanks… so I guess this is one case where T243394: Automatically refresh jouncebot just before a deployment window starts was actually a negative for us – the timer elapsed, jouncebot refreshed the deployments, and saw there was nothing left to announce. It happens 🤷
Reporting this just in case, but if it doesn’t happen more often it’s probably okay to just ignore this and close it after a while.
I guess that depends on how long this task takes us on the master branch. It would be nicer to backport the relevant patch sets, but if it takes too long, that might start to cause trouble with merge conflicts. Given that no significant feature development happens on the release branch, I think I would be fine with running the autofixer there.
But it’s not broken at the moment, is it? If I understand correctly, it would break if we took down query.wikidata.org (but AFAIK that’s not even the plan – it will just become equivalent to query-main.wikidata.org), but it seems to work fine so far. And in that case I don’t see why we should expect any feedback from community members asking to fix it – I expect that whoever uses it (I don’t know who, but apparently plenty of people according to the tracking) is happy with it and might not even realize that it’ll be affected by the graph split.
Fri, Nov 15
Doesn’t seem to happen anymore (see e.g. this recent backport).
Presumably caused by Add type hints to detect bool/null message params (cc @matmarex).
git bisect shows that the difference in behavior was introduced with Lookup: Ensure input event on selection has correct value (T378177). We still need to investigate what this means and if we need a fix in new-lexeme-special-page and/or in Codex.
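As an aside, what git bisect does under the hood is a binary search for the first bad commit; a toy sketch of that search (the commit list and predicate are made up):

```python
def first_bad(commits, is_bad):
    """Binary search for the first 'bad' commit, assuming every commit
    after the first bad one is also bad (the git bisect invariant)."""
    lo, hi = 0, len(commits) - 1  # lo..hi brackets the first bad commit
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):
            hi = mid       # first bad commit is at mid or earlier
        else:
            lo = mid + 1   # first bad commit is after mid
    return commits[lo]

# Toy history: commits c0..c9, regression introduced in c6.
commits = [f"c{i}" for i in range(10)]
print(first_bad(commits, lambda c: int(c[1:]) >= 6))  # → c6
```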
FTR, this change seems to have caused a regression in Special:NewLexeme :/ We’re investigating in T379595.
The above error was not reproducible in the dev entry point, which turns out to be because in the dev entry point we were still using Codex 1.14, whereas MediaWiki now ships Codex 1.16. After updating the new-lexeme-special-page repo to Codex 1.16, the error becomes reproducible in the dev entry point (and some tests start failing).
Needs some more work, unfortunately.
Thu, Nov 14
Sorry, I mixed that up – T328921: Drop PHP 7.4 support from MediaWiki is probably the right task?
AFAIK the comment at https://gitlab.wikimedia.org/repos/ci-tools/patchdemo/-/issues/552#note_48220 is still accurate, in that this would be the minimal configuration to load Wikibase repo and client:
Apparently this happens when we try to redirect from Special:ConstraintReport?entityid=Q1 to Special:ConstraintReport/Q1, but for some ?entityid= strings the resulting title is invalid (trailing underscore). I guess we just need to catch the error (or validate the input first? I’m not sure catching a ParameterAssertionException is great) and report it as an ordinary error.
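A rough sketch of the validate-first option (the entity-ID pattern and function name here are made up for illustration; the real code would go through Wikibase's entity-ID parsing):

```python
import re

# Hypothetical, simplified entity-ID check; Wikibase has a proper parser.
# The point is to reject bad ?entityid= values up front instead of
# catching a ParameterAssertionException after building the title.
ENTITY_ID_RE = re.compile(r"[QPL][1-9]\d*")

def redirect_target(entity_id: str) -> str:
    entity_id = entity_id.strip()
    if not ENTITY_ID_RE.fullmatch(entity_id):
        # Report as an ordinary user-facing error, not an exception page.
        raise ValueError(f"invalid entity ID: {entity_id!r}")
    return f"Special:ConstraintReport/{entity_id}"

print(redirect_target("Q1"))
try:
    redirect_target("Q1_")  # trailing underscore would make an invalid title
except ValueError as e:
    print("rejected:", e)
```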
Wed, Nov 13
I’m confused by that HTML… it looks pretty much like the HTML for the form is supposed to look, except all the messages are just missing? Empty <label></label> etc.
The good news is that this seems to be a test-only breakage – the anonymous edit warning is still shown when actually loading the edit page. (Note that temporary accounts must be disabled for this.)
This is caused by a core change; git bisect says Remove getText usages in the vicinity of MessageCache is the first bad commit. (Which sounds plausible at a glance – it is at least message-related.)
Wikibase changes +2ed.
Tue, Nov 12
Reopening, as the above change was never merged and is still relevant as far as I can tell – I don’t see any changes to Wikibase itself in T356617, only to the release pipeline. I’ve just rebased the change to resolve merge conflicts (it now updates RELEASE-NOTES-1.44 rather than RELEASE-NOTES-1.41).
patch demo which historically does not support wikidata/wikibase
So who’s responsible for this task at WMDE? The current project columns kind of sound like neither the Wikidata team nor the Wikibase Product Platform Team consider themselves responsible (“Radar” / “outside WPP”), which would be unfortunate.
@ArthurTaylor and I looked into this, and eventually arrived at some promising-sounding investigation results. The most important is that in this code:
If the script runs relatively fast in production (assuming the wikitech page is accurate in saying that it takes “a few minutes”), then, given that it’s okay for the property suggester to be temporarily broken, I think the truncate-and-insert approach is okay and option 1 still sounds good. But let’s try out how long the script takes to run under non-k8s mwscript, just so we have a current figure.
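A minimal sketch of the truncate-and-insert idea (using SQLite and a made-up table in place of the real schema; note that in MySQL a real TRUNCATE causes an implicit commit, so readers may briefly see an empty table, which matches the “temporarily broken” caveat above):

```python
import sqlite3

def rebuild_suggestions(conn: sqlite3.Connection, rows) -> None:
    """Replace the whole table's contents in one go.
    In SQLite, DELETE + INSERT can share one transaction; in MySQL,
    TRUNCATE implicitly commits, so the table is briefly empty."""
    with conn:  # one transaction (SQLite)
        conn.execute("DELETE FROM suggestions")
        conn.executemany(
            "INSERT INTO suggestions (pid, score) VALUES (?, ?)", rows
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE suggestions (pid INTEGER, score REAL)")
rebuild_suggestions(conn, [(31, 0.9), (279, 0.5)])
print(conn.execute("SELECT COUNT(*) FROM suggestions").fetchone()[0])  # → 2
```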
Mon, Nov 11
Thanks! IMHO 1 seems like the best option, but we’ll look into how the script behaves (I suspect it can be retried safely).
Fri, Nov 8
Thu, Nov 7
I think I misunderstood the self-service part; the linked docs only say that “Gerrit administrators may immediately act on requests from the following trusted organisations”. (I was confused by some other groups like wmde-wikidata, which seem to be viral in the same way as Trusted-Contributors: any member can add other members. But wmde-mediawiki doesn’t seem to work like that.)
Messages were introduced in FP: SpecialPages: Parameterize description for T256119 FTR.
IIUC this group is supposed to be self-service, but neither Leszek nor I are in it AFAICT? https://gerrit.wikimedia.org/r/admin/groups/8f7f4df5062198c795a6eb18c3536f3410c465fe,members
Wed, Nov 6
It’s not my decision, but given that we haven’t made any progress here in over a year, my vote is for dropping these docs completely and only keeping js/rest-api/ around.
(Also, I feel like we should probably make this a separate keyword, perhaps [AUTO_LANGUAGES], to avoid breaking queries that rely on [AUTO_LANGUAGE] returning a single language code – anything with FILTER(LANG(?itemLabel) = "[AUTO_LANGUAGE]"), for example.)
Tagging Wikidata Query UI since [AUTO_LANGUAGE] is handled there (we just replace the string before sending the query to the backend).
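That string replacement could look roughly like this (a simplified sketch; the real Query UI derives the user's language differently and does more than a bare replace):

```python
def substitute_auto_language(query: str, user_language: str) -> str:
    """Replace the [AUTO_LANGUAGE] placeholder with the user's language
    code before the query is sent to the SPARQL backend (simplified)."""
    return query.replace("[AUTO_LANGUAGE]", user_language)

query = (
    'SERVICE wikibase:label '
    '{ bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }'
)
print(substitute_auto_language(query, "de"))  # placeholder becomes "de"
```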
Tue, Nov 5
Should be resolved now, please reopen if the error recurs.