/irc-logs / w3c / #testing / 2013-08-27

  1. # Session Start: Tue Aug 27 00:00:01 2013
  2. # Session Ident: #testing
  3. # [01:00] * heycam|away is now known as heycam
  4. # [01:31] * Quits: abarsto (~abarsto@public.cloak) ("Leaving.")
  5. # [02:24] * Quits: rhauck (~Adium@public.cloak) ("Leaving.")
  6. # [02:33] * Quits: jhammel (~jhammel@public.cloak) ("leaving")
  7. # [03:24] * Disconnected
  8. # [03:28] * Attempting to rejoin channel #testing
  9. # [03:28] * Rejoined channel #testing
  10. # [03:31] * Quits: krijnh (~krijnhoetmer@public.cloak) (Ping timeout: 180 seconds)
  11. # [04:31] * heycam is now known as heycam|away
  12. # [04:51] * heycam|away is now known as heycam
  13. # [05:29] * heycam is now known as heycam|away
  14. # [06:00] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
  15. # [06:34] * heycam|away is now known as heycam
  16. # [07:11] * Joins: glenn (~gadams@public.cloak)
  17. # [07:18] * Quits: glenn (~gadams@public.cloak) (Ping timeout: 180 seconds)
  18. # [07:41] * Joins: glenn (~gadams@public.cloak)
  19. # [08:33] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
  20. # [09:16] * Joins: darobin (rberjon@public.cloak)
  21. # [09:27] * Joins: Ms2ger (~Ms2ger@public.cloak)
  22. # [09:28] * Joins: zcorpan (~zcorpan@public.cloak)
  23. # [09:31] * Joins: zcorpan_ (~zcorpan@public.cloak)
  24. # [09:35] * Quits: zcorpan (~zcorpan@public.cloak) (Ping timeout: 180 seconds)
  25. # [09:43] * Joins: glenn (~gadams@public.cloak)
  26. # [09:50] * Quits: glenn (~gadams@public.cloak) (Ping timeout: 180 seconds)
  27. # [10:25] * Joins: dom (dom@public.cloak)
  28. # [10:45] * heycam is now known as heycam|away
  29. # [11:29] * Joins: AutomatedTester (~AutomatedTester@public.cloak)
  30. # [12:29] * Joins: abarsto (~abarsto@public.cloak)
  31. # [12:29] * abarsto is now known as ArtB
  32. # [13:25] * Disconnected
  33. # [13:27] * Attempting to rejoin channel #testing
  34. # [13:27] * Rejoined channel #testing
  35. # [13:30] * Quits: krijn (~krijnhoetmer@public.cloak) (Ping timeout: 180 seconds)
  36. # [13:55] * Joins: gitbot (~gitbot@public.cloak)
  37. # [13:55] -gitbot:#testing- [web-platform-tests] jgraham closed pull request #307: Add additional tests for the named getter in form with other listed elements. (master...form-named-getter-listed) https://github.com/w3c/web-platform-tests/pull/307
  38. # [13:55] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
  39. # [14:28] <jgraham> darobin: Uh, I meant the git sha1
  40. # [14:28] <darobin> oh
  41. # [14:28] <darobin> duh
  42. # [14:28] <darobin> yeah
  43. # [14:28] <darobin> smart man
  44. # [14:28] <jgraham> :)
  45. # [14:28] * Ms2ger feels like he's missed something
  46. # [14:29] <jgraham> Ms2ger: public-test-infra
  47. # [14:29] <Ms2ger> And thanks for the merge
  48. # [14:29] <jgraham> Thanks for the test (and the ones from yesterday)
  49. # [14:30] <jgraham> darobin: Oh and it looks like you only replied to me rather than to the whole list
  50. # [14:30] <darobin> jgraham: actually, you sent that to me :)
  51. # [14:30] <jgraham> Oh
  52. # [14:30] <jgraham> Dammit
  53. # [14:31] <jgraham> I keep doing that. This is the first time I know of that I haven't noticed before pressing send though
  54. # [14:33] <jgraham> OK, resent my message to public-test-infra with a bit more detail on what, exactly, I had in mind
  55. # [14:33] * Joins: gitbot (~gitbot@public.cloak)
  56. # [14:33] -gitbot:#testing- [web-platform-tests] AutomatedTester synchronize pull request #305: Adding visibility tests to show elements with HIDDEN attribute are not v... (master...hidden) https://github.com/w3c/web-platform-tests/pull/305
  57. # [14:33] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
  58. # [15:31] * Joins: glenn (~gadams@public.cloak)
  59. # [15:42] * Joins: gitbot (~gitbot@public.cloak)
  60. # [15:42] -gitbot:#testing- [web-platform-tests] AutomatedTester pushed 1 new commit to master: https://github.com/w3c/web-platform-tests/commit/d895e16504ca77ceb490b38a49b000d3e9539add
  61. # [15:42] -gitbot:#testing- web-platform-tests/master d895e16 AutomatedTester: Adding visibility tests to show elements with HIDDEN attribute are not visble
  62. # [15:42] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
  63. # [15:43] * Joins: gitbot (~gitbot@public.cloak)
  64. # [15:43] -gitbot:#testing- [web-platform-tests] AutomatedTester closed pull request #305: Adding visibility tests to show elements with HIDDEN attribute are not v... (master...hidden) https://github.com/w3c/web-platform-tests/pull/305
  65. # [15:43] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
  66. # [15:54] * Quits: darobin (rberjon@public.cloak) (Client closed connection)
  67. # [16:42] * Joins: Automate_ (~AutomatedTester@public.cloak)
  68. # [16:46] * Joins: Automat__ (~AutomatedTester@public.cloak)
  69. # [16:47] * Quits: AutomatedTester (~AutomatedTester@public.cloak) (Client closed connection)
  70. # [16:51] * Joins: botte (~botte@public.cloak)
  71. # [16:52] * Quits: botte (~botte@public.cloak) ("Page closed")
  72. # [16:52] * Quits: Automate_ (~AutomatedTester@public.cloak) (Ping timeout: 180 seconds)
  73. # [16:58] * Quits: Automat__ (~AutomatedTester@public.cloak) (Client closed connection)
  74. # [17:01] * Quits: zcorpan_ (~zcorpan@public.cloak) (Client closed connection)
  75. # [17:01] * Joins: zcorpan (~zcorpan@public.cloak)
  76. # [17:07] * Joins: AutomatedTester (~AutomatedTester@public.cloak)
  77. # [17:09] * Quits: zcorpan (~zcorpan@public.cloak) (Ping timeout: 180 seconds)
  78. # [17:32] * Joins: zcorpan (~zcorpan@public.cloak)
  79. # [17:43] * Joins: rhauck (~Adium@public.cloak)
  80. # [17:46] * Joins: krisk (~krisk@public.cloak)
  81. # [17:56] * Joins: thayakawa (~thayakawa@public.cloak)
  82. # [17:57] * Joins: kkershaw (~kkershaw@public.cloak)
  83. # [17:58] <krisk> Hello and welcome!
  84. # [17:58] <Ms2ger> Hello and goodbye :)
  85. # [17:58] <krisk> We can chat about HTML5 testing
  86. # [17:58] * Ms2ger wanders off to dinner
  87. # [17:59] * Quits: AutomatedTester (~AutomatedTester@public.cloak) (Client closed connection)
  88. # [18:00] <krisk> kkershaw any questions?
  89. # [18:02] <kkershaw> hi - sorry - just got a phone call when you sent your welcome. will be back in a minute.
  90. # [18:02] * Joins: jmdyck_web (~jmdyck_web@public.cloak)
  91. # [18:06] * Joins: botte (~botte@public.cloak)
  92. # [18:06] <kkershaw> back again - I was wondering about the review process for test submissions. For example - the opera submission of media tests has been in review for quite a while.
  93. # [18:07] <kkershaw> Is there an expectation about how quickly reviews will get done, the priority of doing them, etc. Is this all pretty much a best effort enterprise?
  94. # [18:08] <kkershaw> Also, how does one become a reviewer for someone else's submission?
  95. # [18:08] <Ms2ger> Best effort
  96. # [18:09] <Ms2ger> You become a reviewer by doing reviews :)
  97. # [18:15] <kkershaw> OK - on the pull request in question (#93), if I bring it up in Critic, there is a list of 5 people noted as "Needs Review From" and each has the comment of "Wake Up".
  98. # [18:15] <Ms2ger> I'm probably on that list
  99. # [18:16] <kkershaw> Yup. How did you and the other 4 get there?
  100. # [18:16] <kkershaw> And, it says
  101. # [18:16] <kkershaw> "Wake Up!" next to everybody's name....
  102. # [18:17] <Ms2ger> On https://critic.hoppipolla.co.uk/home, you can set up filters
  103. # [18:17] <Ms2ger> And whenever a PR is made that matches one of your filters, you're tagged as a reviewer
  104. # [18:18] <Ms2ger> So in particular, I'm tagged for everything in html/, even though I don't really know anything about media
  105. # [18:18] <kkershaw> that helps...thanks
  106. # [18:19] * Quits: rhauck (~Adium@public.cloak) ("Leaving.")
  107. # [18:19] <kkershaw> Given that, what has to happen before this particular pull request is actually brought into the master? What are the gating items?
  108. # [18:19] <Ms2ger> Someone has to step up to do the review
  109. # [18:19] <jgraham> The gating item is that someone actually does the review
  110. # [18:20] <kkershaw> just one person?
  111. # [18:20] <jgraham> Or a group of people
  112. # [18:20] <jgraham> It doesn't really matter if the review is done well
  113. # [18:20] <kkershaw> argh!
  114. # [18:20] <Ms2ger> Well, preferably, it's done well
  115. # [18:20] <Ms2ger> But a cursory review is better than no review
  116. # [18:21] <jgraham> kkershaw: So, the problem is, that we don't have any sticks here
  117. # [18:21] <Ms2ger> And not a lot of carrots either
  118. # [18:21] <kkershaw> sure - I'm beginning to understand about the sticks and carrots
  119. # [18:21] <jgraham> We can't impose a formal process and say "do it like this or badness will befall you"
  120. # [18:21] <jgraham> The only carrot that we have is that tests are good
  121. # [18:22] <jgraham> Once people are actually running the tests then they have a motivation to organise review because they want to get the tests into their automation
  122. # [18:22] <krisk> I expect to spend more time on this in the Fall
  123. # [18:22] <jgraham> But at the moment none of Gecko/WebKit/Blink run all the tests all the time
  124. # [18:23] <jgraham> I'm working on changing this for Gecko. Others are working on changing it for Blink/WebKit
  125. # [18:23] <jgraham> (possibly just Blink, I'm not sure)
  126. # [18:23] <kkershaw> Yup. So, in this case (PR #93), it's the content of the entire PR that will be merged into the master whenever a review is done.
  127. # [18:23] <Ms2ger> Yes
  128. # [18:24] <jgraham> Right
  129. # [18:24] <Ms2ger> (The fact that it's 17k lines of tests at once is not terribly helpful)
  130. # [18:25] <krisk> Microsoft also runs the tests, especially when we are implementing a standard
  131. # [18:25] <kkershaw> Yeah - small PRs are a little easier to deal with. OK, so if I said that I had reviewed all these tests and that they were good, would that trigger the merge?
  132. # [18:25] <jgraham> kkershaw: Yes
  133. # [18:26] <jgraham> If you are intending to review such a large number of tests, and particularly if you are intending to do it as a group, I recommend using critic since you can track what has been reviewed
  134. # [18:26] <jgraham> And the test author can track which issues they have fixed
  135. # [18:26] <jgraham> Without resorting to pieces of paper
  136. # [18:26] <kkershaw> Yeah, Critic looks good. Thanks for pointing me to that. But you don't know if I'm competent to review.
  137. # [18:27] <jgraham> Well I think that is another problem we have
  138. # [18:27] <jgraham> It's kind of easy for everyone to feel like "I'm not the best reviewer for this"
  139. # [18:28] <kkershaw> That doesn't matter? The code author just needs someone to vote in their favor and away we go...?
  140. # [18:28] <jgraham> Well of course it is better if the person reviewing has some understanding of the spec
  141. # [18:28] <jgraham> But I don't think you should need to understand the spec better to review tests than to write tests
  142. # [18:30] <kkershaw> So, if I review and I suggest a change, will it be up to the author to act on that suggestion? And they could either accept it, reject it, modify, etc..?
  143. # [18:30] <jgraham> So, for example, sometimes I have reviewed things where I haven't really read the spec before and taken the review as a chance to read the spec
  144. # [18:30] <jgraham> Right. If you raise an issue, they are expected to address it unless they can argue convincingly that it doesn't need to be addressed
  145. # [18:31] <jgraham> (there isn't a defined escalation process, but in practice it isn't a problem)
  146. # [18:31] <jgraham> Or someone else can come along and address the issue
  147. # [18:31] <jgraham> It doesn't have to be the original author
  148. # [18:32] <jgraham> (although github makes that a bit harder with most reviews because they are tied to personal forks)
  149. # [18:32] <jgraham> (which makes me sad)
  150. # [18:34] <kkershaw> Good - so here's an example. Having looked through some of the tests in the opera media submission, I noticed that they don't have the "href" tag that provides a spec reference link.
  151. # [18:34] <jgraham> Anyway, the general feeling that I'm trying to convey is that we're aiming for review of everything, but it isn't expected that review is perfect, more that it is best-effort, and that because of the difficult environment, we try to make the process as flexible as possible
  152. # [18:34] <Ms2ger> If you want those, you'll probably need to add them yourself :)
  153. # [18:34] <kkershaw> The guide on writing tests says that should be there. Is this a reasonable suggestion for a change? And Could I add it myself?
  154. # [18:34] <jgraham> Right, the whole href thing is controversial. See for example darobin's recent post to the list which suggests that these should be in an external file
  155. # [18:34] <jgraham> Or external system, really
  156. # [18:35] <kkershaw> Right....been following that and hoping for convergence but not really seeing it.
  157. # [18:35] <Ms2ger> It's not bad to have them (imo), but I don't see anyone rushing to add them after the fact :)
  158. # [18:35] <jgraham> So we haven't tried to enforce that. Also it makes importing previously written tests like these rather difficult since no one has organically decided to add those links
  159. # [18:36] <jgraham> in their own testsuites
  160. # [18:36] <jgraham> So, my feeling about that specific example is that if you want that data you would be best off first getting those tests merged and then submitting the assert links as a separate PR
  161. # [18:37] <jgraham> I doubt that zcorpan is going to spend his time adding them
  162. # [18:37] <jgraham> (and I imagine that if he did, Opera would regard that as a low value activity)
  163. # [18:38] <kkershaw> Understood. Sorry to fixate on this but the people who hold carrots and sticks out for me are often REALLY interested in assessing how well tests provide coverage of a spec. It's tough to answer that question without this kind of linkage.
  164. # [18:38] <jgraham> Sure
  165. # [18:39] <jgraham> I think that disconnect is why darobin suggests an external tool
  166. # [18:39] <kkershaw> Now for something completely different
  167. # [18:39] <jgraham> (personally I'm interested in trying a different approach to this problem via instrumented builds. But that's for the future and may not work anyway)
  168. # [18:40] <jmdyck_web> jgraham: "darobin's recent post to the list
  169. # [18:40] <jmdyck_web> ": public-test-infra?
  170. # [18:40] <kkershaw> I joined this chat for the first time 2 weeks ago - 2 colleagues of mine are on the call today. Brian Otte and Takashi Hayakawa - both from CableLabs. I was going to ask if either of them had questions for this group.
  171. # [18:41] <Ms2ger> (Welcome!)
  172. # [18:41] <jgraham> jmdyck_web: http://www.w3.org/mid/52176470.8040000@w3.org
  173. # [18:41] <jgraham> (yes)
  174. # [18:41] <thayakawa> Oops... hi all!
  175. # [18:41] * jgraham waves
  176. # [18:42] * Joins: jhammel (~jhammel@public.cloak)
  177. # [18:42] <thayakawa> kkershaw has been covering most of the questions, so not much new from me.
  178. # [18:43] <thayakawa> I just want to add that...
  179. # [18:43] <thayakawa> The test-spec relationship would be very helpful for me from a developer's viewpoint.
  180. # [18:45] <Ms2ger> Which part of the spec a test is testing, you mean?
  181. # [18:45] <thayakawa> Yes
  182. # [18:46] <Ms2ger> I haven't found it very useful myself, but ymmv, of course
  183. # [18:46] <kkershaw> I was curious about whether there's expertise in this chat community about the IDLHarness tool. I was trying to understand what it does, how it works, and so on.
  184. # [18:46] <Ms2ger> I've worked on it
  185. # [18:47] <Ms2ger> It's mostly Aryeh's work, though, and he isn't around often
  186. # [18:47] <jgraham> Yeah, I discussed it as a possibility before he implemented it. So I assume it is quite like we discussed, but I don't remember lots of the details :)
  187. # [18:48] <Ms2ger> The idea is that you paste in the IDL snippets from a spec, and it automatically tests those parts of the WebIDL spec that it can
  188. # [18:48] <kkershaw> Is there any written doc on it you could point us to?
  189. # [18:48] <krisk> I have some test for this that make use of the IDL harness as well that I plan on submitting
  190. # [18:48] <Ms2ger> There's some in the file itself
  191. # [18:49] <Ms2ger> And there's some tests that use it in the repo
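The idlharness.js flow Ms2ger describes — paste a spec's IDL into a test page and let the harness generate WebIDL-conformance tests — looks roughly like the following minimal test file. This is a sketch, not a file from the repo: the `Foo` interface, its IDL, and `window.foo` are invented for illustration.

```html
<!DOCTYPE html>
<title>idlharness sketch (hypothetical interface)</title>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/resources/WebIDLParser.js"></script>
<script src="/resources/idlharness.js"></script>
<script>
// Paste the IDL snippets copied from the spec under test:
var idl_array = new IdlArray();
idl_array.add_idls("interface Foo { attribute DOMString bar; };");
// Optionally map interfaces to live objects so that members can be
// tested on real instances ("window.foo" is hypothetical):
idl_array.add_objects({Foo: ["window.foo"]});
idl_array.test();
</script>
```

Running the page then emits one testharness.js result per testable WebIDL requirement (existence, types, attributes, and so on) for the pasted interfaces.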
  192. # [18:50] * Quits: dom (dom@public.cloak) ("")
  193. # [18:50] <kkershaw> OK - thanks.
  194. # [18:51] <Ms2ger> If you have particular questions, I'm always happy to try to answer :)
  195. # [18:53] <kkershaw> Thanks, I really appreciate the insight and good responses from all of you. I need to head off to a meeting for now. Cheers...
  196. # [18:53] * Quits: botte (~botte@public.cloak) ("Page closed")
  197. # [18:54] * Joins: rhauck (~Adium@public.cloak)
  198. # [18:55] <Ms2ger> See you
  199. # [18:55] * Quits: thayakawa (~thayakawa@public.cloak) (" HydraIRC -> http://www.hydrairc.com <- The professional IRC Client :D")
  200. # [18:56] <Ms2ger> We really want your tests ;)
  201. # [18:58] <MikeSmith> jgraham: I asked Ms2ger and gsnedders about this already, but I want to get your feedback too: I'd like to add a web-platform-tests/conformance-checkers directory for managing test cases for the validator
  202. # [18:59] <MikeSmith> for (non-automated, non-testharness.js) document-conformance tests
  203. # [18:59] <jgraham> MikeSmith: If you plan to use them, that sounds reasonable to me
  204. # [19:00] <MikeSmith> OK
  205. # [19:00] <MikeSmith> I can set up a file-naming convention or manifest convention or whatever to further distinguish them from the other tests in the repo
  206. # [19:01] <MikeSmith> to keep them out of the way
  207. # [19:02] <Ms2ger> If they're in their own dir, do whatever, IMO
  208. # [19:02] <MikeSmith> I'm already using them btw. The value to me in having them in the w-p-t repo is to piggyback off the network effects of testtwf etc.
  209. # [19:02] <MikeSmith> ok
  210. # [19:02] <jgraham> I guess a file naming convention helps, if you don't mind doing it
  211. # [19:02] <jgraham> Or call them all -manual
  212. # [19:02] <jgraham> Or whatever we decided for that
  213. # [19:03] <MikeSmith> -manual would work fine for me
  214. # [19:04] <MikeSmith> as long as I could also do e.g., 001-notvalid-manual.html
  215. # [19:04] <MikeSmith> because I'm already using "-notvalid" to flag tests that are intentionally supposed to fail the validator
  216. # [19:04] <MikeSmith> and I have a test runner already that handles that
  217. # [19:05] <jgraham> Nice
  218. # [19:05] <MikeSmith> basically it's what we have been using already for regression testing of the validator
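The naming convention being settled on above can be sketched as a small classifier. The function name and exact rules are assumptions for illustration, not anything the repo defines:

```javascript
// Hypothetical sketch of the "-manual" / "-notvalid" convention discussed
// above: "-manual" means the file has no testharness.js automation, and
// "-notvalid" means the document is intentionally supposed to fail the
// validator.
function classifyTestFile(filename) {
  return {
    manual: filename.includes("-manual"),
    expectedValid: !filename.includes("-notvalid"),
  };
}

console.log(classifyTestFile("001-notvalid-manual.html"));
// → { manual: true, expectedValid: false }
```

A runner like the one MikeSmith mentions could then use `expectedValid` to decide whether a clean validator run counts as a pass or a failure for that file.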
  219. # [19:06] <jmdyck_web> are they actually manual? (or does manual not mean what i think it means?)
  220. # [19:07] <MikeSmith> jmdyck_web: they are non-manual in that they don't run under the testharness.js automation
  221. # [19:07] <MikeSmith> they are not browser tests
  222. # [19:08] <jmdyck_web> ok, so it's not a reflection of their automatability
  223. # [19:10] <MikeSmith> btw about the performance of the validator code, I ran the validator test runner on all 5404 HTML/XHTML test files in the w-p-t repo, and it takes about 13 seconds total to parse them all and report all the validation errors (17255 errors...). So, about 2 milliseconds per document (unless I counted wrong)
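As a quick sanity check on the figure above (purely the arithmetic, assuming the quoted totals and that time is spread evenly across documents):

```javascript
// 5404 documents parsed and checked in about 13 seconds:
const documents = 5404;
const totalMs = 13 * 1000;
const msPerDocument = totalMs / documents;
console.log(msPerDocument.toFixed(1)); // → 2.4
```

So the per-document figure works out to roughly 2.4 ms rather than 2 ms, but the order of magnitude quoted holds.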
  224. # [19:10] <MikeSmith> jmdyck_web: yeah
  225. # [19:10] <MikeSmith> jmdyck_web: but there are no automation hooks of any kind in the test files themselves
  226. # [19:11] <MikeSmith> and none needed, as far as the validator goes
  227. # [19:11] <MikeSmith> they are just arbitrary HTML files
  228. # [19:12] <Ms2ger> MikeSmith, so you're testing valid/not valid or more than that?
  229. # [19:12] <jmdyck_web> Is there an indication somewhere of how the notvalid files are not valid?
  230. # [19:12] <jmdyck_web> (or what sort of validation error to expect?)
  231. # [19:13] <MikeSmith> Ms2ger: just valid/not valid at this point. The test runner doesn't get down to testing the actual errors at that granularity
  232. # [19:13] <Ms2ger> MikeSmith, understood
  233. # [19:15] <MikeSmith> Ms2ger: but Henri did once write a different test mechanism that provides more specific checking against the particular expected error. So I may be able to hook into that later
  234. # [19:15] <jmdyck_web> MikeSmith: and this is the w3c validator, aka validator.nu?
  235. # [19:15] <MikeSmith> jmdyck_web: yeah
  236. # [19:16] <MikeSmith> jmdyck_web: and the answer to your earlier question is the same one I just gave Ms2ger :)
  237. # [19:18] <MikeSmith> ok the other test harness is here:
  238. # [19:18] <MikeSmith> https://github.com/validator/validator/tree/master/test-harness
  239. # [19:19] <MikeSmith> it produces output like this:
  240. # [19:19] <MikeSmith> $ python validator-tester.py checkall
  241. # [19:19] <MikeSmith> http://simon.html5.org/test/validator/character-encoding/non-ascii-past-512.html Expected 1,512;1,512: No explicit character encoding declaration has been seen yet (assumed “windows-1252”) but the document contains non-ASCII. but saw no errors.
  242. # [19:19] <MikeSmith> http://simon.html5.org/test/validator/character-encoding/non-ascii.html Expected 1,23;1,23: No explicit character encoding declaration has been seen yet (assumed “windows-1252”) but the document contains non-ASCII. but saw no errors.
  243. # [19:19] <Ms2ger> The parser tests have... something
  244. # [19:26] <jgraham> Yeah, someone posted a patch to update some of the information there
  245. # [19:26] <jgraham> I'm not sure what happened to it
  246. # [19:27] <jgraham> I think most people were ignoring the errors stuff because it makes writing the tests much harder and isn't a win except for the validator
  247. # [19:42] <jmdyck_web> MikeSmith: should conformance-checkers directory be within web-platform-tests/html? Or is it for checking conformance to other specs too?
  248. # [19:43] <jmdyck_web> (or should it be named html-conformance-checkers?)
  249. # [19:48] <MikeSmith> jmdyck_web: other specs too
  250. # [19:53] <jmdyck_web> cool. such as?
  251. # [19:55] <Ms2ger> SVG?
  252. # [20:11] * Joins: hober (~ted@public.cloak)
  253. # [20:20] * Quits: krisk (~krisk@public.cloak) (Ping timeout: 180 seconds)
  254. # [20:31] * Quits: glenn (~gadams@public.cloak) (Client closed connection)
  255. # [20:34] * Joins: thayakawa (~thayakawa@public.cloak)
  256. # [20:51] * Quits: ArtB (~abarsto@public.cloak) ("Leaving.")
  257. # [21:04] * Quits: jmdyck_web (~jmdyck_web@public.cloak) (Ping timeout: 180 seconds)
  258. # [21:06] * Quits: zcorpan (~zcorpan@public.cloak) (Client closed connection)
  259. # [21:06] * Joins: zcorpan (~zcorpan@public.cloak)
  260. # [21:19] * Joins: gitbot (~gitbot@public.cloak)
  261. # [21:19] -gitbot:#testing- [web-platform-tests] Ms2ger opened pull request #308: Make the last argument to DOMImplementation.createDocument optional. (master...createDocument-optional) https://github.com/w3c/web-platform-tests/pull/308
  262. # [21:19] * Parts: gitbot (~gitbot@public.cloak) (gitbot)
  263. # [22:26] * Quits: kkershaw (~kkershaw@public.cloak) (" HydraIRC -> http://www.hydrairc.com <- The alternative IRC client")
  264. # [22:34] * Quits: jhammel (~jhammel@public.cloak) ("biab")
  265. # [23:30] * Joins: jhammel (~jhammel@public.cloak)
  266. # [23:35] * Parts: jhammel (~jhammel@public.cloak) (jhammel)
  267. # [23:48] * Joins: botte (~botte@public.cloak)
  268. # [23:49] * Joins: rhauck1 (~Adium@public.cloak)
  269. # [23:50] * Quits: Ms2ger (~Ms2ger@public.cloak) ("nn")
  270. # [23:54] * Quits: rhauck (~Adium@public.cloak) (Ping timeout: 180 seconds)
  271. # Session Close: Wed Aug 28 00:00:00 2013
