- # Session Start: Tue Aug 27 00:00:01 2013
- # Session Ident: #testing
- # [01:00] * heycam|away is now known as heycam
- # [01:31] * Quits: abarsto (~abarsto@public.cloak) ("Leaving.")
- # [02:24] * Quits: rhauck (~Adium@public.cloak) ("Leaving.")
- # [02:33] * Quits: jhammel (~jhammel@public.cloak) ("leaving")
- # [03:24] * Disconnected
- # [03:28] * Attempting to rejoin channel #testing
- # [03:28] * Rejoined channel #testing
- # [03:31] * Quits: krijnh (~krijnhoetmer@public.cloak) (Ping timeout: 180 seconds)
- # [04:31] * heycam is now known as heycam|away
- # [04:51] * heycam|away is now known as heycam
- # [05:29] * heycam is now known as heycam|away
- # [06:00] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
- # [06:34] * heycam|away is now known as heycam
- # [07:11] * Joins: glenn (~gadams@public.cloak)
- # [07:18] * Quits: glenn (~gadams@public.cloak) (Ping timeout: 180 seconds)
- # [07:41] * Joins: glenn (~gadams@public.cloak)
- # [08:33] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
- # [09:16] * Joins: darobin (rberjon@public.cloak)
- # [09:27] * Joins: Ms2ger (~Ms2ger@public.cloak)
- # [09:28] * Joins: zcorpan (~zcorpan@public.cloak)
- # [09:31] * Joins: zcorpan_ (~zcorpan@public.cloak)
- # [09:35] * Quits: zcorpan (~zcorpan@public.cloak) (Ping timeout: 180 seconds)
- # [09:43] * Joins: glenn (~gadams@public.cloak)
- # [09:50] * Quits: glenn (~gadams@public.cloak) (Ping timeout: 180 seconds)
- # [10:25] * Joins: dom (dom@public.cloak)
- # [10:45] * heycam is now known as heycam|away
- # [11:29] * Joins: AutomatedTester (~AutomatedTester@public.cloak)
- # [12:29] * Joins: abarsto (~abarsto@public.cloak)
- # [12:29] * abarsto is now known as ArtB
- # [13:25] * Disconnected
- # [13:27] * Attempting to rejoin channel #testing
- # [13:27] * Rejoined channel #testing
- # [13:30] * Quits: krijn (~krijnhoetmer@public.cloak) (Ping timeout: 180 seconds)
- # [13:55] * Joins: gitbot (~gitbot@public.cloak)
- # [13:55] -gitbot:#testing- [web-platform-tests] jgraham closed pull request #307: Add additional tests for the named getter in form with other listed elements. (master...form-named-getter-listed) https://github.com/w3c/web-platform-tests/pull/307
- # [13:55] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
- # [14:28] <jgraham> darobin: Uh, I meant the git sha1
- # [14:28] <darobin> oh
- # [14:28] <darobin> duh
- # [14:28] <darobin> yeah
- # [14:28] <darobin> smart man
- # [14:28] <jgraham> :)
- # [14:28] * Ms2ger feels like he's missed something
- # [14:29] <jgraham> Ms2ger: public-test-infra
- # [14:29] <Ms2ger> And thanks for the merge
- # [14:29] <jgraham> Thanks for the test (and the ones from yesterday)
- # [14:30] <jgraham> darobin: Oh and it looks like you only replied to me rather than to the whole list
- # [14:30] <darobin> jgraham: actually, you sent that to me :)
- # [14:30] <jgraham> Oh
- # [14:30] <jgraham> Dammit
- # [14:31] <jgraham> I keep doing that. This is the first time I know of that I haven't noticed before pressing send though
- # [14:33] <jgraham> OK, resent my message to public-test-infra with a bit more detail on what, exactly, I had in mind
- # [14:33] * Joins: gitbot (~gitbot@public.cloak)
- # [14:33] -gitbot:#testing- [web-platform-tests] AutomatedTester synchronize pull request #305: Adding visibility tests to show elements with HIDDEN attribute are not v... (master...hidden) https://github.com/w3c/web-platform-tests/pull/305
- # [14:33] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
- # [15:31] * Joins: glenn (~gadams@public.cloak)
- # [15:42] * Joins: gitbot (~gitbot@public.cloak)
- # [15:42] -gitbot:#testing- [web-platform-tests] AutomatedTester pushed 1 new commit to master: https://github.com/w3c/web-platform-tests/commit/d895e16504ca77ceb490b38a49b000d3e9539add
- # [15:42] -gitbot:#testing- web-platform-tests/master d895e16 AutomatedTester: Adding visibility tests to show elements with HIDDEN attribute are not visble
- # [15:42] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
- # [15:43] * Joins: gitbot (~gitbot@public.cloak)
- # [15:43] -gitbot:#testing- [web-platform-tests] AutomatedTester closed pull request #305: Adding visibility tests to show elements with HIDDEN attribute are not v... (master...hidden) https://github.com/w3c/web-platform-tests/pull/305
- # [15:43] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
- # [15:54] * Quits: darobin (rberjon@public.cloak) (Client closed connection)
- # [16:42] * Joins: Automate_ (~AutomatedTester@public.cloak)
- # [16:46] * Joins: Automat__ (~AutomatedTester@public.cloak)
- # [16:47] * Quits: AutomatedTester (~AutomatedTester@public.cloak) (Client closed connection)
- # [16:51] * Joins: botte (~botte@public.cloak)
- # [16:52] * Quits: botte (~botte@public.cloak) ("Page closed")
- # [16:52] * Quits: Automate_ (~AutomatedTester@public.cloak) (Ping timeout: 180 seconds)
- # [16:58] * Quits: Automat__ (~AutomatedTester@public.cloak) (Client closed connection)
- # [17:01] * Quits: zcorpan_ (~zcorpan@public.cloak) (Client closed connection)
- # [17:01] * Joins: zcorpan (~zcorpan@public.cloak)
- # [17:07] * Joins: AutomatedTester (~AutomatedTester@public.cloak)
- # [17:09] * Quits: zcorpan (~zcorpan@public.cloak) (Ping timeout: 180 seconds)
- # [17:32] * Joins: zcorpan (~zcorpan@public.cloak)
- # [17:43] * Joins: rhauck (~Adium@public.cloak)
- # [17:46] * Joins: krisk (~krisk@public.cloak)
- # [17:56] * Joins: thayakawa (~thayakawa@public.cloak)
- # [17:57] * Joins: kkershaw (~kkershaw@public.cloak)
- # [17:58] <krisk> Hello and welcome!
- # [17:58] <Ms2ger> Hello and goodbye :)
- # [17:58] <krisk> We can chat about HTML5 testing
- # [17:58] * Ms2ger wanders off to dinner
- # [17:59] * Quits: AutomatedTester (~AutomatedTester@public.cloak) (Client closed connection)
- # [18:00] <krisk> kkershaw any questions?
- # [18:02] <kkershaw> hi - sorry - just got a phone call when you sent your welcome. will be back in a minute.
- # [18:02] * Joins: jmdyck_web (~jmdyck_web@public.cloak)
- # [18:06] * Joins: botte (~botte@public.cloak)
- # [18:06] <kkershaw> back again - I was wondering about the review process for test submissions. For example - the opera submission of media tests has been in review for quite a while.
- # [18:07] <kkershaw> Is there an expectation about how quickly reviews will get done, the priority of doing them, etc. Is this all pretty much a best effort enterprise?
- # [18:08] <kkershaw> Also, how does one become a reviewer for someone else's submission?
- # [18:08] <Ms2ger> Best effort
- # [18:09] <Ms2ger> You become a reviewer by doing reviews :)
- # [18:15] <kkershaw> OK - on the pull request in question (#93), if I bring it up in Critic, there is a list of 5 people noted as "Needs Review From" and each has the comment of "Wake Up".
- # [18:15] <Ms2ger> I'm probably on that list
- # [18:16] <kkershaw> Yup. How did you and the other 4 get there?
- # [18:16] <kkershaw> And, it says
- # [18:16] <kkershaw> "Wake Up!" next to everybody's name....
- # [18:17] <Ms2ger> On https://critic.hoppipolla.co.uk/home, you can set up filters
- # [18:17] <Ms2ger> And whenever a PR is made that matches one of your filters, you're tagged as a reviewer
- # [18:18] <Ms2ger> So in particular, I'm tagged for everything in html/, even though I don't really know anything about media
- # [18:18] <kkershaw> that helps...thanks
- # [18:19] * Quits: rhauck (~Adium@public.cloak) ("Leaving.")
- # [18:19] <kkershaw> Given that, what has to happen before this particular pull request is actually brought into the master? What are the gating items?
- # [18:19] <Ms2ger> Someone has to step up to do the review
- # [18:19] <jgraham> The gating item is that someone actually does the review
- # [18:20] <kkershaw> just one person?
- # [18:20] <jgraham> Or a group of people
- # [18:20] <jgraham> It doesn't really matter if the review is done well
- # [18:20] <kkershaw> argh!
- # [18:20] <Ms2ger> Well, preferably, it's done well
- # [18:20] <Ms2ger> But a cursory review is better than no review
- # [18:21] <jgraham> kkershaw: So, the problem is, that we don't have any sticks here
- # [18:21] <Ms2ger> And not a lot of carrots either
- # [18:21] <kkershaw> sure - I'm beginning to understand about the sticks and carrots
- # [18:21] <jgraham> We can't impose a formal process and say "do it like this or badness will befall you"
- # [18:21] <jgraham> The only carrot that we have is that tests are good
- # [18:22] <jgraham> Once people are actually running the tests then they have a motivation to organise review because they want to get the tests into their automation
- # [18:22] <krisk> I expect to spend more time on this in the Fall
- # [18:22] <jgraham> But at the moment none of Gecko/WebKit/Blink run all the tests all the time
- # [18:23] <jgraham> I'm working on changing this for Gecko. Others are working on changing it for Blink/WebKit
- # [18:23] <jgraham> (possibly just Blink, I'm not sure)
- # [18:23] <kkershaw> Yup. So, in this case (PR #93), it's the content of the entire PR that will be merged into the master whenever a review is done.
- # [18:23] <Ms2ger> Yes
- # [18:24] <jgraham> Right
- # [18:24] <Ms2ger> (The fact that it's 17k lines of tests at once is not terribly helpful)
- # [18:25] <krisk> Microsoft also runs the tests, especially when we are implementing a standard
- # [18:25] <kkershaw> Yeah - small PRs are a little easier to deal with. OK, so if I said that I had reviewed all these tests and that they were good, would that trigger the merge?
- # [18:25] <jgraham> kkershaw: Yes
- # [18:26] <jgraham> If you are intending to review such a large number of tests, and particularly if you are intending to do it as a group, I recommend using critic since you can track what has been reviewed
- # [18:26] <jgraham> And the test author can track which issues they have fixed
- # [18:26] <jgraham> Without resorting to pieces of paper
- # [18:26] <kkershaw> Yeah, Critic looks good. Thanks for pointing me to that. But you don't know if I'm competent to review.
- # [18:27] <jgraham> Well I think that is another problem we have
- # [18:27] <jgraham> It's kind of easy for everyone to feel like "I'm not the best reviewer for this"
- # [18:28] <kkershaw> That doesn't matter? The code author just needs someone to vote in their favor and away we go...?
- # [18:28] <jgraham> Well of course it is better if the person reviewing has some understanding of the spec
- # [18:28] <jgraham> But I don't think you should need to understand the spec better to review tests than to write tests
- # [18:30] <kkershaw> So, if I review and I suggest a change, will it be up to the author to act on that suggestion? And they could either accept it, reject it, modify, etc..?
- # [18:30] <jgraham> So, for example, sometimes I have reviewed things where I haven't really read the spec before and taken the review as a chance to read the spec
- # [18:30] <jgraham> Right. If you raise an issue, they are expected to address it unless they can argue convincingly that it doesn't need to be addressed
- # [18:31] <jgraham> (there isn't a defined escalation process, but in practice it isn't a problem)
- # [18:31] <jgraham> Or someone else can come along and address the issue
- # [18:31] <jgraham> It doesn't have to be the original author
- # [18:32] <jgraham> (although github makes that a bit harder with most reviews because they are tied to personal forks)
- # [18:32] <jgraham> (which makes me sad)
- # [18:34] <kkershaw> Good - so here's an example. Having looked through some of the tests in the opera media submission, I noticed that they don't have the "href" tag that provides a spec reference link.
- # [18:34] <jgraham> Anyway, the general feeling that I'm trying to convey is that we're aiming for review of everything, but it isn't expected that review is perfect, more that it is best-effort, and that because of the difficult environment, we try to make the process as flexible as possible
- # [18:34] <Ms2ger> If you want those, you'll probably need to add them yourself :)
- # [18:34] <kkershaw> The guide on writing tests says that should be there. Is this a reasonable suggestion for a change? And Could I add it myself?
- # [18:34] <jgraham> Right, the whole href thing is controversial. See for example darobin's recent post to the list which suggests that these should be in an external file
- # [18:34] <jgraham> Or external system, really
- # [18:35] <kkershaw> Right....been following that and hoping for convergence but not really seeing it.
- # [18:35] <Ms2ger> It's not bad to have them (imo), but I don't see anyone rushing to add them after the fact :)
- # [18:35] <jgraham> So we haven't tried to enforce that. Also it makes importing previously written tests like these rather difficult since no one has organically decided to add those links
- # [18:36] <jgraham> in their own testsuites
- # [18:36] <jgraham> So, my feeling about that specific example is that if you want that data you would be best off first getting those tests merged and then submitting the assert links as a separate PR
- # [18:37] <jgraham> I doubt that zcorpan is going to spend his time adding them
- # [18:37] <jgraham> (and I imagine that if he did, Opera would regard that as a low value activity)
- # [18:38] <kkershaw> Understood. Sorry to fixate on this but the people who hold carrots and sticks out for me are often REALLY interested in assessing how well tests provide coverage of a spec. It's tough to answer that question without this kind of linkage.
- # [18:38] <jgraham> Sure
- # [18:39] <jgraham> I think that disconnect is why darobin suggests an external tool
- # [18:39] <kkershaw> Now for something completely different
- # [18:39] <jgraham> (personally I'm interested in trying a different approach to this problem via instrumented builds. But that's for the future and may not work anyway)
- # [18:40] <jmdyck_web> jgraham: "darobin's recent post to the list
- # [18:40] <jmdyck_web> ": public-test-infra?
- # [18:40] <kkershaw> I joined this chat for the first time 2 weeks ago - 2 colleagues of mine are on the call today. Brian Otte and Takashi Hayakawa - both from CableLabs. I was going to ask if either of them had questions for this group.
- # [18:41] <Ms2ger> (Welcome!)
- # [18:41] <jgraham> jmdyck_web: http://www.w3.org/mid/52176470.8040000@w3.org
- # [18:41] <jgraham> (yes)
- # [18:41] <thayakawa> Oops... hi all!
- # [18:41] * jgraham waves
- # [18:42] * Joins: jhammel (~jhammel@public.cloak)
- # [18:42] <thayakawa> kkershaw has been covering most of the questions, so not much new from me.
- # [18:43] <thayakawa> I just want to add that...
- # [18:43] <thayakawa> The test-spec relationship would be very helpful for me from a developer's viewpoint.
- # [18:45] <Ms2ger> Which part of the spec a test is testing, you mean?
- # [18:45] <thayakawa> Yes
- # [18:46] <Ms2ger> I haven't found it very useful myself, but ymmv, of course
- # [18:46] <kkershaw> I was curious about whether there's expertise in this chat community about the IDLHarness tool. I was trying to understand what it does, how it works, and so on.
- # [18:46] <Ms2ger> I've worked on it
- # [18:47] <Ms2ger> It's mostly Aryeh's work, though, and he isn't around often
- # [18:47] <jgraham> Yeah, I discussed it as a possibility before he implemented it. So I assume it is quite like we discussed, but I don't remember lots of the details :)
- # [18:48] <Ms2ger> The idea is that you paste in the IDL snippets from a spec, and it automatically tests those parts of the WebIDL spec that it can
- # [18:48] <kkershaw> Is there any written doc on it you could point us to?
- # [18:48] <krisk> I have some tests for this that make use of the IDL harness as well that I plan on submitting
- # [18:48] <Ms2ger> There's some in the file itself
- # [18:49] <Ms2ger> And there's some tests that use it in the repo
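Ms2ger's description above ("paste in the IDL snippets from a spec") can be sketched as a minimal test file. This is a hypothetical example, not one of the tests in the repo: the resource paths follow the usual w-p-t layout, the `IdlArray` API is the one documented in idlharness.js itself, and the interface name `Dummy` and global `window.dummy` are made up for illustration.

```html
<!doctype html>
<title>idlharness sketch (hypothetical example)</title>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/resources/idlharness.js"></script>
<script>
"use strict";
// Paste the IDL fragment copied from the spec under test:
var idl_array = new IdlArray();
idl_array.add_idls("interface Dummy { attribute DOMString name; };");
// Point each interface at a live object so its members can be checked:
idl_array.add_objects({Dummy: ["window.dummy"]});
// Generates one testharness.js subtest per checkable WebIDL requirement.
idl_array.test();
</script>
```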
- # [18:50] * Quits: dom (dom@public.cloak) ("")
- # [18:50] <kkershaw> OK - thanks.
- # [18:51] <Ms2ger> If you have particular questions, I'm always happy to try to answer :)
- # [18:53] <kkershaw> Thanks, I really appreciate the insight and good responses from all of you. I need to head off to a meeting for now. Cheers...
- # [18:53] * Quits: botte (~botte@public.cloak) ("Page closed")
- # [18:54] * Joins: rhauck (~Adium@public.cloak)
- # [18:55] <Ms2ger> See you
- # [18:55] * Quits: thayakawa (~thayakawa@public.cloak) (" HydraIRC -> http://www.hydrairc.com <- The professional IRC Client :D")
- # [18:56] <Ms2ger> We really want your tests ;)
- # [18:58] <MikeSmith> jgraham: I asked Ms2ger and gsnedders about this already, but I want to get your feedback too: I'd like to add a web-platform-tests/conformance-checkers directory for managing test cases for the validator
- # [18:59] <MikeSmith> for (non-automated, non-testharness.js) document-conformance tests
- # [18:59] <jgraham> MikeSmith: If you plan to use them, that sounds reasonable to me
- # [19:00] <MikeSmith> OK
- # [19:00] <MikeSmith> I can set up a file-naming convention or manifest convention or whatever to further distinguish them from the other tests in the repo
- # [19:01] <MikeSmith> to keep them out of the way
- # [19:02] <Ms2ger> If they're in their own dir, do whatever, IMO
- # [19:02] <MikeSmith> I'm already using them btw. The value to me in having them in the w-p-t repo is to piggyback off the network effects of testtwf etc.
- # [19:02] <MikeSmith> ok
- # [19:02] <jgraham> I guess a file naming convention helps, if you don't mind doing it
- # [19:02] <jgraham> Or call them all -manual
- # [19:02] <jgraham> Or whatever we decided for that
- # [19:03] <MikeSmith> -manual would work fine for me
- # [19:04] <MikeSmith> as long as I could also do e.g., 001-notvalid-manual.html
- # [19:04] <MikeSmith> because I'm already using "-notvalid" to flag tests that are intentionally supposed to fail the validator
- # [19:04] <MikeSmith> and I have a test runner already that handles that
- # [19:05] <jgraham> Nice
- # [19:05] <MikeSmith> basically it's what we have been using already for regression testing of the validator
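The naming convention being settled on above can be applied mechanically. The sketch below is a hypothetical illustration of how a runner might read the `-manual` and `-notvalid` filename suffixes; the actual runner's logic is not shown in the log.

```python
def classify(filename):
    """Classify a test file by the suffix convention discussed above.

    Hypothetical sketch: "-manual" marks files that don't run under
    testharness.js automation; "-notvalid" marks documents that are
    intentionally supposed to fail the validator.
    """
    stem = filename.rsplit(".", 1)[0]  # drop the .html/.xhtml extension
    return {
        "manual": stem.endswith("-manual"),
        "expect_valid": "-notvalid" not in stem,
    }
```

So a name like `001-notvalid-manual.html` is both outside the automation and expected to be reported invalid.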
- # [19:06] <jmdyck_web> are they actually manual? (or does manual not mean what i think it means?)
- # [19:07] <MikeSmith> jmdyck_web: they are non-manual in that they don't run under the testharness.js automation
- # [19:07] <MikeSmith> they are not browser tests
- # [19:08] <jmdyck_web> ok, so it's not a reflection of their automatability
- # [19:10] <MikeSmith> btw about the performance of the validator code, I ran the validator test runner on all 5404 HTML/XHTML test files in the w-p-t repo, and it takes about 13 seconds total to parse them all and report all the validation errors (17255 errors). So, about 2 milliseconds per document (unless I counted wrong)
- # [19:10] <MikeSmith> jmdyck_web: yeah
- # [19:10] <MikeSmith> jmdyck_web: but there are no automation hooks of any kind in the test files themselves
- # [19:11] <MikeSmith> and none needed, as far as the validator goes
- # [19:11] <MikeSmith> they are just arbitrary HTML files
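The throughput figure above is easy to sanity-check from the numbers quoted in the log (5404 files, about 13 seconds, 17255 errors):

```python
total_seconds = 13.0
files = 5404
errors = 17255

ms_per_doc = total_seconds / files * 1000   # ~2.4 ms per document
errors_per_doc = errors / files             # ~3.2 reported errors per document

print(round(ms_per_doc, 2), round(errors_per_doc, 1))
```

So "about 2 milliseconds per document" checks out (closer to 2.4 ms), with roughly three validation errors reported per file on average.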
- # [19:12] <Ms2ger> MikeSmith, so you're testing valid/not valid or more than that?
- # [19:12] <jmdyck_web> Is there an indication somewhere of how the notvalid files are not valid?
- # [19:12] <jmdyck_web> (or what sort of validation error to expect?)
- # [19:13] <MikeSmith> Ms2ger: just valid/not valid at this point. The test runner doesn't get down to testing the actual errors at granularity
- # [19:13] <Ms2ger> MikeSmith, understood
- # [19:15] <MikeSmith> Ms2ger: but Henri did once write a different test mechanism that provides more specific checking against the particular expected error. So I may be able to hook into that later
- # [19:15] <jmdyck_web> MikeSmith: and this is the w3c validator, aka validator.nu?
- # [19:15] <MikeSmith> jmdyck_web: yeah
- # [19:16] <MikeSmith> jmdyck_web: and the answer to your earlier question is the same one I just gave Ms2ger :)
- # [19:18] <MikeSmith> ok the other test harness is here:
- # [19:18] <MikeSmith> https://github.com/validator/validator/tree/master/test-harness
- # [19:19] <MikeSmith> it produces output like this:
- # [19:19] <MikeSmith> $ python validator-tester.py checkall
- # [19:19] <MikeSmith> http://simon.html5.org/test/validator/character-encoding/non-ascii-past-512.html Expected 1,512;1,512: No explicit character encoding declaration has been seen yet (assumed “windows-1252”) but the document contains non-ASCII. but saw no errors.
- # [19:19] <MikeSmith> http://simon.html5.org/test/validator/character-encoding/non-ascii.html Expected 1,23;1,23: No explicit character encoding declaration has been seen yet (assumed “windows-1252”) but the document contains non-ASCII. but saw no errors.
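Failure lines in that format are simple to split apart. The sketch below is a hypothetical parser for the `<url> Expected <message> but saw <result>` shape shown above; it is not part of validator-tester.py itself.

```python
def parse_failure(line):
    """Split one validator-tester failure line into its parts.

    Hypothetical helper for the output format shown above: split on the
    first " Expected " and the last " but saw ", since the expected
    message itself may contain the word "but".
    """
    url, rest = line.split(" Expected ", 1)
    expected, saw = rest.rsplit(" but saw ", 1)
    return {"url": url, "expected": expected, "saw": saw.rstrip(".")}
```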
- # [19:19] <Ms2ger> The parser tests have... something
- # [19:26] <jgraham> Yeah, someone posted a patch to update some of the information there
- # [19:26] <jgraham> I'm not sure what happened to it
- # [19:27] <jgraham> I think most people were ignoring the errors stuff because it makes writing the tests much harder and isn't a win except for the validator
- # [19:42] <jmdyck_web> MikeSmith: should conformance-checkers directory be within web-platform-tests/html? Or is it for checking conformance to other specs too?
- # [19:43] <jmdyck_web> (or should it be named html-conformance-checkers?)
- # [19:48] <MikeSmith> jmdyck_web: other specs too
- # [19:53] <jmdyck_web> cool. such as?
- # [19:55] <Ms2ger> SVG?
- # [20:11] * Joins: hober (~ted@public.cloak)
- # [20:20] * Quits: krisk (~krisk@public.cloak) (Ping timeout: 180 seconds)
- # [20:31] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
- # [20:34] * Joins: thayakawa (~thayakawa@public.cloak)
- # [20:51] * Quits: ArtB (~abarsto@public.cloak) ("Leaving.")
- # [21:04] * Quits: jmdyck_web (~jmdyck_web@public.cloak) (Ping timeout: 180 seconds)
- # [21:06] * Quits: zcorpan (~zcorpan@public.cloak) (Client closed connection)
- # [21:06] * Joins: zcorpan (~zcorpan@public.cloak)
- # [21:19] * Joins: gitbot (~gitbot@public.cloak)
- # [21:19] -gitbot:#testing- [web-platform-tests] Ms2ger opened pull request #308: Make the last argument to DOMImplementation.createDocument optional. (master...createDocument-optional) https://github.com/w3c/web-platform-tests/pull/308
- # [21:19] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
- # [22:26] * Quits: kkershaw (~kkershaw@public.cloak) (" HydraIRC -> http://www.hydrairc.com <- The alternative IRC client")
- # [22:34] * Quits: jhammel (~jhammel@public.cloak) ("biab")
- # [23:30] * Joins: jhammel (~jhammel@public.cloak)
- # [23:35] * Parts: jhammel (~jhammel@public.cloak) (jhammel)
- # [23:48] * Joins: botte (~botte@public.cloak)
- # [23:49] * Joins: rhauck1 (~Adium@public.cloak)
- # [23:50] * Quits: Ms2ger (~Ms2ger@public.cloak) ("nn")
- # [23:54] * Quits: rhauck (~Adium@public.cloak) (Ping timeout: 180 seconds)
- # Session Close: Wed Aug 28 00:00:00 2013