All times are shown in UTC.
Time | Nick | Message |
---|---|---|
16:43 | SirStan | chris++ |
20:27 | chris | morning |
20:38 | ebegin | hi all! |
20:47 | eiro | http://search.cpan.org/dist/in[…]t/lib/indirect.pm |
20:47 | hello guys | |
20:48 | it would be cool to use this pragma everywhere in koha, tracking all the "new CGI" syntax | |
20:48 | 'night all :) | |
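The pragma eiro links, `indirect` from CPAN, flags Perl's indirect-object notation (`new CGI` instead of `CGI->new`) at compile time. A minimal sketch of what adopting it in a Koha script might look like, assuming a reasonably recent indirect.pm:

```perl
#!/usr/bin/perl
# Minimal sketch of the "no indirect" idea eiro is suggesting. The pragma
# is lexical, so it only covers the scope it is enabled in.
use strict;
use warnings;
no indirect;             # warn on indirect method-call syntax in this scope
# no indirect ':fatal';  # or make it a compile-time error instead

use CGI;

my $bad = new CGI;       # indirect syntax -- the pragma complains here
my $ok  = CGI->new;      # the unambiguous arrow form passes silently
```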
20:52 | chris | oh good idea eiro |
20:52 | hiya lori | |
20:52 | LBA | howdy, happy father's day to all you daddies |
20:54 | chris | not fathers day here yet, not until september, i wonder if i can get 2 |
20:54 | :) | |
20:55 | LBA | go for it! |
20:55 | chris | one for each child maybe |
20:55 | :) | |
20:57 | richard | hi |
20:58 | ebegin | Did anyone notice that koha-zebra-ctl.sh and koha-zebraqueuequeue-ctl.sh were moved for 3.00.02? They are now located in misc/bin (they were in bin before)
21:00 | Unfortunately, I think the install script was not modified accordingly | |
21:11 | chris | ahh no i didnt notice that |
21:11 | you might want to drop Henri Damien a note | |
21:40 | hdl_laptop | thx ebegin for noticing. |
21:40 | I thought it was already like that for 3.0.1 | |
22:13 | chris | back |
22:13 | that took longer than i planned | |
22:20 | i need more coffee before i answer more mailing list questions i think | |
22:20 | collum: thanks for answering nelson as well | |
22:21 | collum | You expressed what I was thinking far more eloquently. |
23:52 | SirStan | What list was this eloquent reply? |
23:53 | ah | |
00:13 | chris: opac search (via nytprof): http://vtwireless.com/nytprof/ | |
00:14 | the times there are 3x normal.. | |
00:20 | chris | theres something odd going on |
00:21 | print $query->header; | |
00:21 | should not take 2 seconds | |
00:21 | (thats using CGI.pm) one of the most widely used modules | |
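For context, a profile like the one SirStan links is usually produced along these lines; the script path, query string and report location below are assumptions, not details taken from his run:

```sh
# Hypothetical reproduction of a profile run like SirStan's. Assumes
# KOHA_CONF / PERL5LIB are already exported; NYTProf writes ./nytprof.out.
cd /usr/share/koha/opac/cgi-bin/opac        # assumed install location
REQUEST_METHOD=GET QUERY_STRING='q=history' \
    perl -d:NYTProf opac-search.pl > /dev/null

# Render the HTML report that was linked above
nytprofhtml --out /var/www/nytprof
```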
03:28 | Amit | hi chris, brendan, mason |
03:28 | morning #koha | |
03:28 | hi richard | |
03:28 | richard | hola amit |
03:29 | mason | hiya amit, good morning to you too |
03:30 | and atz and richard too ;) | |
03:32 | Amit | hi atz |
03:34 | richard | hi mason |
03:51 | brendan | heya amit -- richard and mason too :) |
03:51 | Amit | heya brendan |
03:53 | chris | hi brendan and amit |
03:54 | brendan | good evening chris -- |
03:54 | Almost time for me to go to bed. Too much sun today | |
03:54 | SirStan | mmm was a beautiful day up here. |
03:55 | before i roll my own -- anyone have a small script that lets me scan an ISBN, look it up in my catalog, find the record, and then scan in a barcode? | |
03:55 | So I can roll through linking 9k records to barcodes | |
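Nobody had one handy, but the loop SirStan describes is small; a rough sketch against the 3.x schema follows. The DSN and credentials are placeholders, and a production version should go through Koha's item API rather than writing items.barcode directly, so that the change gets indexed.

```perl
#!/usr/bin/perl
# Rough sketch of the scan loop SirStan describes: scan an ISBN, find the
# matching biblio, then scan a barcode and attach it to an empty item.
# Schema assumptions: Koha 3.x biblio/biblioitems/items tables.
# NOTE: updating items.barcode directly bypasses Koha's item API and the
# Zebra queue; treat this as illustration only.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( 'dbi:mysql:database=koha', 'kohauser', 'password',
    { RaiseError => 1 } );

my $find_biblio = $dbh->prepare(
    'SELECT b.biblionumber, b.title
       FROM biblio b JOIN biblioitems bi USING (biblionumber)
      WHERE bi.isbn LIKE ?'
);
my $set_barcode = $dbh->prepare(
    'UPDATE items SET barcode = ?
      WHERE biblionumber = ? AND barcode IS NULL LIMIT 1'
);

while (1) {
    print 'Scan ISBN (blank line to quit): ';
    my $isbn = <STDIN>;
    last unless defined $isbn;
    chomp $isbn;
    last unless length $isbn;

    $find_biblio->execute("%$isbn%");
    my ( $biblionumber, $title ) = $find_biblio->fetchrow_array;
    $find_biblio->finish;
    unless ($biblionumber) {
        print "No match for $isbn\n";
        next;
    }

    print "Found '$title' ($biblionumber). Scan barcode: ";
    my $barcode = <STDIN>;
    last unless defined $barcode;
    chomp $barcode;
    next unless length $barcode;

    my $rows = $set_barcode->execute( $barcode, $biblionumber );
    print $rows > 0
        ? "Linked barcode $barcode to biblio $biblionumber\n"
        : "No empty item found on biblio $biblionumber\n";
}
```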
03:55 | Amit | hi sirstan |
03:55 | SirStan | hai Amit. |
03:57 | chris | not me SirStan |
03:58 | SirStan | no worries |
03:59 | chris: that nytprof was running 3x slower than realtime.. | |
03:59 | it executed in 2 seconds -- but the nytprof said 6 seconds. | |
03:59 | so any timings are 300% elevated | |
03:59 | chris | still seems really off that a print header line would take 2 seconds
04:00 | all that line is doing is print "Content-Type: text/html\n\n"; | |
04:00 | SirStan | ah. |
04:00 | my bad. | |
04:00 | I didnt pipe it to /dev/null | |
04:00 | chris | ahh |
04:00 | SirStan | actually that line prints EVERYTHING
04:00 | so it took a long time to print to a buffer | |
04:00 | and display | |
04:01 | chris | right |
04:01 | well that subroutine prints everything | |
04:01 | SirStan | mhm |
04:01 | chris | print $query->header |
04:01 | SirStan | , $html |
04:01 | chris | ah right you are |
04:01 | in that case, not too bad | |
04:02 | there is a patch already sent for master | |
04:02 | for authorised values | |
04:02 | SirStan | nice! |
04:02 | i didnt see it | |
04:02 | did it hit devel? | |
04:03 | chris | http://git.koha.org/cgi-bin/gi[…]8db9ace8c21b63338 |
04:03 | koha-patches | |
04:03 | might be a list worth subscribing to; all patches get sent there, and then get accepted into the repo | |
04:03 | SirStan | Fredric's post? |
04:03 | from jun14th? | |
04:03 | chris | galen reworked it |
04:04 | but yep | |
04:04 | http://git.koha.org/cgi-bin/gi[…]?p=Koha;a=summary | |
04:05 | SirStan | nice. |
04:05 | i get the patch emails. | |
04:05 | i even submitted a few ~6mos ago | |
04:05 | chris | ah yep, you are in the history :) |
04:05 | SirStan | but didnt get a warm fuzzy feeling from that :D.
04:06 | ill keep my changes local | |
04:06 | chris | oh? |
04:08 | as far as i can see they were all accepted | |
04:08 | SirStan | yea.. they were |
04:09 | some of the resulting discussion was BLAR WHY DID YOU DO THAT. | |
04:09 | on minor changes. | |
04:09 | chris | i wouldnt let that stop you |
04:10 | if it was blar, and then not accepted that would be a different story | |
04:17 | mason | i can honestly say that all the critique ive had about my commits has been valid and appreciated
04:22 | pianohacker1 | $SirStan->{'name'} ? Trying to find you in the history |
04:23 | chris | number 75 pianohacker |
04:24 | SirStan | just a couple minor edits piano. |
04:24 | ~25 lines committed | |
04:24 | ive done more on the wiki than in code | |
04:26 | pianohacker | chris: ah, thanks |
04:27 | SirStan: Community peer review has its ups and downs, don't it? | |
04:28 | (usually positive ones, but not always) | |
04:30 | SirStan | pianohacker: the lib world is weird.
04:31 | Koha is slow and bloated. | |
04:31 | yet no one cares.. and people get defensive if you suggest that is the case. | |
04:32 | chris | who doesnt care? |
04:32 | SirStan | not you. |
04:32 | seriously... you're amazing for that memoize hack | |
04:32 | but local librarians.. and way back when i suggested the 500mb koha install was bloated.. | |
04:33 | mason | :) |
04:33 | SirStan | "buy faster hardware" |
04:33 | mason | or delete the other language dirs, after you install... |
04:34 | SirStan | mmm |
04:34 | chris | thats being worked on, the debian packages will ship the localisation as separate packages
04:36 | i dont know a single developer who thinks koha is fast enough, if thats any consolation for you :) | |
04:37 | mason | but haaay, dont let a *perceived* harsh critique of your work sabotage your desire to contribute to the project |
04:37 | pianohacker | Koha, being a somewhat specialized software project, |
04:38 | mostly only gets really big improvements when someone is willing to pay for them | |
04:38 | SirStan | to liblime? |
04:39 | pianohacker | To catalyst, katipo, biblibre, etc. |
04:39 | and liblime, yes | |
04:39 | SirStan | who/what is catalyst? |
04:39 | ive heard the other names | |
04:39 | chris | ive worked for katipo, then liblime, now catalyst |
04:39 | SirStan | wow :) |
04:40 | mason | i think peoples satisfaction/success with the koha-project is largely what they make it.... |
04:40 | SirStan | mason: koha is amazing. as an "outsider" to the library world who has worked with many-a-bad ILS systems from major US vendors... |
04:40 | mason | glass half-full / glass half-empty etc |
04:40 | SirStan | Koha rocks. |
04:40 | pianohacker | (note: above comments based on 1.5 years of working with the koha project, take them with a full container of iodized salt) |
04:41 | chris | ill see your 1.5 and raise you 10 :-) |
04:41 | pianohacker | chris: well, exactly :) |
04:41 | chris | SirStan: there are a few projects on the go
04:41 | pianohacker is doing some neat stuff with AJAX and circulation | |
04:42 | ive been experimenting with template-toolkit instead of html::template | |
04:42 | and we want to add DBIx::Class in too | |
04:42 | + mod_perl safing it | |
04:43 | pianohacker | Though ironically doing paid work on Koha is interfering with working on Koha for fun (including getting a nice version of the ajaxcirc patches sent in) |
04:43 | chris | which should make it smaller/faster |
04:43 | heh, thats the problem | |
04:43 | no one wants to pay for the optimisation stuff, so you end up working on data migrations instead | |
04:43 | SirStan | our issue here is that our users are on dial up. |
04:43 | and koha taking 400k/page is a barrier | |
04:44 | chris | yep |
04:44 | you have got expires headers set up eh? | |
04:44 | SirStan | yea. but that doesnt help first load |
04:44 | yui+jquery+custom js | |
04:44 | chris | yeah, id set them to like a year |
04:44 | the other option | |
04:44 | pianohacker | SirStan: And even if you set expire headers up, that's still ~40 round trips per page just to get 304 Not Modified
04:45 | chris | is that koha is designed to work with /theme/language/
04:45 | you could make cut down versions of the opac ... maybe a minimal theme | |
04:45 | i think mobile users would like that too | |
04:45 | back in the day | |
04:46 | koha was designed precisely to work on dialup | |
04:46 | and not fast dialup either ... dialup that could do 33k3 max | |
04:46 | SirStan | i know :) |
04:46 | chris | because the electric fences messed up any compression |
04:46 | SirStan | actually |
04:46 | chris | tick tick tick on the phone line doesnt help |
04:46 | SirStan | same issue here |
04:46 | ironically | |
04:47 | chris | i do think, that with not too much effort |
04:47 | pianohacker | SirStan: At the very least, you could minimize the JS and CSS and combine them both into just two files |
04:47 | chris | *nod* |
04:47 | mod_deflate | |
04:47 | too | |
04:47 | but now i must go pick up my son ... id encourage checking out mod_deflate and/or mod_gzip | |
04:48 | and making the expires headers a long long time from now :) | |
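What chris is describing is essentially the stock mod_expires / mod_deflate setup; a sketch for the OPAC virtual host, with the one-year lifetime he suggests (module availability, file types and MIME types are assumptions about the local Apache):

```apache
# Sketch of the far-future Expires + compression setup discussed above,
# for the OPAC virtual host.
<IfModule mod_expires.c>
    ExpiresActive On
    # Static theme assets: cache for a year, as chris suggests
    <FilesMatch "\.(css|js|png|gif|jpg|ico)$">
        ExpiresDefault "access plus 1 year"
    </FilesMatch>
</IfModule>

<IfModule mod_deflate.c>
    # Compress the generated HTML and the text assets on the wire
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript text/javascript
</IfModule>
```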
04:48 | SirStan | hmm. |
04:48 | mhm* -- ive got it shrunk down quite a bit | |
04:48 | just running everything through minimizers helps a lot | |
04:48 | pianohacker | Yup |
04:49 | Some of the included JS libraries are pre-minimized, but Koha's own JS is not (to ease development) | |
04:50 | It probably wouldn't be horribly difficult to write a script that generated a version of the prog theme with minimized HTML/CSS/JS | |
04:50 | SirStan | took me about 30 minutes |
04:51 | pianohacker | Well, there you go then. Please think about submitting it in |
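The sort of script pianohacker and SirStan are talking about can stay quite small; a sketch using the CPAN JavaScript::Minifier and CSS::Minifier modules, meant to be run on a copy of the prog theme rather than on the live templates (the theme path is an assumption):

```perl
#!/usr/bin/perl
# Minimal sketch of a theme minifier: walk a copy of the prog theme and
# minify *.js / *.css in place. Run it on a copy, not on your live theme.
use strict;
use warnings;
use File::Find;
use JavaScript::Minifier ();
use CSS::Minifier        ();

my $theme_dir = shift @ARGV
    or die "Usage: $0 /path/to/koha-tmpl/opac-tmpl/prog-min\n";

find( sub {
    return unless -f $_ && /\.(js|css)$/;
    my $type = $1;

    open my $in, '<', $_ or die "read $File::Find::name: $!";
    local $/;
    my $source = <$in>;
    close $in;

    my $small = $type eq 'js'
        ? JavaScript::Minifier::minify( input => $source )
        : CSS::Minifier::minify( input => $source );

    open my $out, '>', $_ or die "write $File::Find::name: $!";
    print {$out} $small;
    close $out;
    print "minified $File::Find::name\n";
}, $theme_dir );
```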
05:56 | chris | back |
07:02 | hdl_laptop | hi |
07:03 | Elwell_ | morning all |
07:03 | chris | hi Elwell_ and hdl_laptop |
07:08 | hi nicomo and Kivutar | |
07:08 | Kivutar | hello |
07:08 | nicomo | hi chris |
07:10 | Elwell_ | Git Q - does it include old history (from SVN) or was it a clean export? |
07:10 | chris | it includes old |
07:10 | (from cvs) | |
07:10 | git.koha.org has back to 2000 | |
07:11 | Elwell_ | ace. Now I need to make my script work with git :-) |
07:11 | chris | i have another git repo for the cvs repository prior to putting koha up on sourceforge in 2000 |
07:11 | http://stats.workbuffer.org/ | |
07:11 | Elwell_ | chris: how many developers at that time? (pre 2k) |
07:12 | chris | http://stats.workbuffer.org/ko[…]1012/authors.html |
07:12 | 2 main ones | |
07:13 | youve seen the history file eh? | |
07:13 | Elwell_ | yeah |
07:13 | chris | basically i used those 2 repos, plus the mailing lists to create it |
07:16 | Elwell_ | chris: Trying to show 'commitment' to open source projects - ie how many developers stick with a project long term (and then see if they're being paid to do so...)
07:17 | is 'open source' really open :-) | |
07:17 | chris | ahh right |
07:17 | Elwell_ | pitching a paper at a management journal |
07:17 | chris | cool |
07:18 | most of the time i have been paid to do at least some koha work, but i had a year, april 2008 - april 2009 when i wasnt being paid to do any work | |
07:19 | (any work on koha that is) | |
07:19 | Elwell_ | 'frinstance > 30% of the last Linux Kernel (by changeset) was paid for by RH,Intel,Novell,IBM |
07:20 | chris | http://stats.workbuffer.org/ko[…]-now/authors.html |
07:20 | that might be interesting, the date of first commit, and the last one | |
07:20 | actually i need to regen that, 2 secs | |
07:20 | Elwell_ | do you keep any 'company' or sponsoring info to map usernames to? |
07:21 | chris | only in my head |
07:21 | Elwell_ | np |
07:21 | chris | i could tell you pretty much all of them if you wanted |
07:22 | Elwell_ | I'll have a poke around once I do some Real Work :-/ |
07:22 | chris | :) |
07:23 | k regened the stats from the latest rebase | |
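For the first-commit / last-commit question above, something along these lines inside a clone of the Koha repository would do it (the author name is just a placeholder):

```sh
# Span of one author's involvement, run inside a clone of the Koha repo.
# --reverse lists oldest first, so head -1 of each gives the two endpoints.
git log --reverse --author='Chris Cormack' --date=short --pretty='%ad  %an' | head -1
git log           --author='Chris Cormack' --date=short --pretty='%ad  %an' | head -1
```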
07:24 | hi paul_p | |
07:24 | paul_p | hi chris. |
07:33 | chris | very cool logo nicomo :) |
07:33 | nicomo | thanks chris |
07:33 | was great fun to make actually | |
07:34 | chris | Elwell_: i think for Koha > 95% was paid for by libraries
07:34 | via katipo, liblime, biblibre etc | |
07:34 | actually maybe 90% | |
07:34 | nicomo | Elwell_: and that seems pretty natural since it's not general-public, general-purpose software
07:35 | chris | but thats just a total guess |
07:35 | and it gets tricky, if someone is paid for a 40hour week and works 60 hours on koha etc | |
07:37 | i think in koha, its quite interesting, because users contribute more than say linux | |
07:37 | ie there are millions of linux users, and the percentage that contribute in either time or money back to the project would be much lower than for koha | |
07:38 | and once again, i think as nicomo has said, it is because its a specialised piece of software | |
07:38 | Elwell_ | chris: yeah thats what I'm expecting |
09:07 | |Lupin| | hello there ! |
09:07 | chris | hi |Lupin| |
09:08 | |Lupin| | hi chris ! how are you ? |
09:09 | (Is it possible in koha to see all the records that are present in the catalogue?) | |
09:10 | chris | not in the web interface |
09:10 | |Lupin| | ah too bad.. The other way is in SQL, then ? |
09:10 | chris | *nod* |
09:11 | |Lupin| | or s there a third way ? |
09:11 | chris | thats the only way i can think of |
09:11 | |Lupin| | okay |
09:12 | I have this strange error message I already had: Can't call method "as_usmarc" on an undefined value at /usr/local/stow/koha-3.00.01-stable/lib/C4/Search.pm line 2119. | |
09:13 | chris | when you are doing a search? |
09:13 | |Lupin| | I'm wondering if koha's tables haven't been corrupted and if there are some sanity checks I could run, or perhaps some scripts to regenerate sane tables...
09:13 | chris: yes | |
09:13 | chris | it could be just one record, with an invalid marc record |
09:13 | |Lupin| | chris: okay |
09:14 | chris | do you have xslt switched on? |
09:14 | (that question only applies if you are searching in the opac) | |
09:14 | |Lupin| | but I also have the error when doing a search which is supposed to give back a record provided by the french national library
09:15 | not sure this record has been successfully imported, though | |
09:15 | chris: I am searching in the opac and I don't know how to see whether xslt is turned on or off ... | |
09:16 | chris | if you search in the librarian interface, same search, do you get the same error? |
09:16 | |Lupin| | chris: I'm checking this... didn't know it could make a difference... |
09:17 | chris | it probably wont in this case, but if you do have xslt turned on (in the opac tab in system preferences) |
09:17 | the opac and the librarian interface will render results differently | |
09:18 | this sounds like it couldnt create a valid MARC::Record object | |
09:18 | so when it tried to call as_usmarc it blew up | |
09:19 | |Lupin| | in the librarian interface it doesn't find anything |
09:19 | chris | and usually it cant make MARC::Record object if there is something invalid in what it is using (the marcxml column in biblioitem) |
09:19 | you do a select marcxml from biblioitems where biblionumber = somenumber | |
09:20 | |Lupin| | I can check the table directly... |
09:20 | chris | and quickly eyeball that column |
09:20 | id look for missing closing tags | |
09:20 | or something like that | |
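Eyeballing can be automated: a short sketch that walks biblioitems and reports rows whose marcxml will not parse into a MARC::Record object, which is roughly the failure behind the as_usmarc error above (DSN and credentials are placeholders):

```perl
#!/usr/bin/perl
# Report biblioitems rows whose marcxml cannot be turned into a
# MARC::Record object -- roughly what blows up with "as_usmarc on an
# undefined value" in C4::Search.
use strict;
use warnings;
use DBI;
use MARC::Record;
use MARC::File::XML ( BinaryEncoding => 'utf8' );

my $dbh = DBI->connect( 'dbi:mysql:database=koha', 'kohauser', 'password',
    { RaiseError => 1 } );

my $sth = $dbh->prepare('SELECT biblionumber, marcxml FROM biblioitems');
$sth->execute;

while ( my ( $biblionumber, $marcxml ) = $sth->fetchrow_array ) {
    my $record = eval { MARC::Record->new_from_xml( $marcxml, 'utf8' ) };
    if ( !$record || $@ ) {
        print "biblionumber $biblionumber: bad marcxml ($@)\n";
    }
}
```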
09:21 | |Lupin| | there are five results
09:21 | chris | orly? |
09:22 | |Lupin| | yes |
09:22 | it's normal | |
09:22 | chris | three biblioitems for the same biblionumber? |
09:22 | |Lupin| | I just tried to import a few records semi-manually... |
09:22 | chris | ah |
09:22 | in which case i have no idea which one it will actually be trying to use when rendering the results | |
09:23 | |Lupin| | okay |
09:23 | let me see more precisely what happens... | |
09:27 | chris: wget http://inova.snv.jussieu.fr/log | |
09:28 | chris: would you mind taking just one minute to have a look at it and tell me what you think about it, please ? | |
09:30 | fredericd | |Lupin|: \n in MARC records does not sound good!
09:31 | |Lupin| | fredericd: yes I was surprised to see them too, but I thought it may be mysql printing... ? |
09:32 | chris | hmm |
09:32 | |Lupin| | I didn't touch the marcxml directly, ever. The only tool I used to import was bulkmarcimport
09:32 | and by the way it's koha 3.00.01 | |
09:33 | chris | cant see anything obvious, BUT, you shouldnt have 5 biblioitems attached to the same biblio |
09:34 | |Lupin| | chris: I think it's because when I did the bulkmarcimport for the first time, it gave a lot of warnings, so I ran it several times
09:34 | chris | it should then make different biblio's |
09:34 | ie there should be a 1-1 relationship | |
09:35 | |Lupin| | chris: does this mean that there is a bug in bulkmarcimport ? |
09:35 | (the data were the same for each run) | |
09:35 | chris | there might have been for 3.0.1 |
09:36 | i havent seen it make multiple biblioitems attached to one biblio before though | |
09:36 | |Lupin| | chris: so... should I update to 3.00.02 ? |
09:37 | chris | the code expects a one to one relationship, so i have no idea which one of those 5 will be actually being returned |
09:38 | probably the first one | |
09:38 | |Lupin| | chris: and also, I don't think the same marc xml appears five times. From what I can see, I'd say there are three occurrences for "Fille de rouge" which is a marc record I have built manually, and two occurrences of "La marche consulaire", which come from the national library |
09:38 | chris | yes but do they all have the same biblionumber? |
09:38 | what is the sql you used to get those 5? | |
09:39 | |Lupin| | select marcxml from biblioitems |
09:39 | chris | ahh so you didnt do where biblionumber=something? |
09:40 | |Lupin| | and the biblionumbers are distinct |
09:40 | from 7 to 11 | |
09:40 | chris | right i misunderstood i thought you were just trying to get the one problem record |
09:40 | |Lupin| | chris: no because I didn't know the biblionumbers... |
09:41 | chris: no it's my fault, I'm not clear enough myself to explain things the right way | |
09:41 | chris: what I'm trying to do is | |
09:41 | 1. See which titles are present in the catalog | |
09:41 | chris | right |
09:41 | |Lupin| | 2. understand why a search for a word does not work. |
09:41 | chris | select title from biblio; |
09:42 | is the easy way to spot that one | |
09:42 | what is the word you are searching for? | |
09:42 | |Lupin| | I tried fille and consulaire |
09:42 | both fail | |
09:43 | (in the OPAC they give the message I have shown previously) | |
09:43 | So "Fille de rouge" indeed appears three times, whereas "La marche consulaire" appears once. | |
09:43 | s/once/twice/ | |
09:44 | chris | right |
09:44 | |Lupin| | so now 1 is solved and 2 remains. |
09:45 | chris | how about we do a bulkmarcimport -d |
09:45 | to clear it out and just load in one record | |
09:46 | |Lupin| | yes, good idea |
09:46 | let me do that. | |
09:48 | the -d didn't seem to work, because bulkmarcimport displayed its help | |
09:48 | chris | ahh you need to do bulkmarcimport -d -file whatever_the_file_is_called -n 1 |
09:49 | |Lupin| | ah okay |
09:49 | I thought -d was enough | |
09:50 | chris | ah no, sorry, currently you can only clear it out as part of an import |
09:51 | |Lupin| | yep |
09:51 | my mistake | |
09:51 | no problem | |
09:52 | it worked, modulo a lot of warnings ! | |
09:52 | okay, now I indeed have just one title. | |
09:53 | chris | using zebra? |
09:53 | |Lupin| | no |
09:53 | normally, no. | |
09:53 | I did, but now I do not any longer.. | |
09:54 | chris | in that case you will need to run rebuild_nozebra.pl |
09:55 | with -r for reset | |
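Putting the whole exchange together, the reload sequence looks roughly like this; the misc/migration_tools/ location matches a 3.0 source tree and may differ in an installed system:

```sh
# Wipe and reload a single test record, then rebuild the NoZebra index.
# Paths assume a Koha 3.0 source tree; adjust for your install.
cd /path/to/koha/misc/migration_tools

perl bulkmarcimport.pl -d -file /tmp/records.mrc -n 1   # -d deletes existing records first
perl rebuild_nozebra.pl -r                              # -r resets the index before rebuilding
```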
09:55 | |Lupin| | is there a way to ensure that koha really does not use zebra ? |
09:56 | oh | |
09:56 | chris | yep, it wont use it if you set the system preference to not use it
09:56 | |Lupin| | even when zebra is not used there is a command to run to rebuild the index ?
09:56 | chris | only after an import |
09:56 | |Lupin| | chris: is the system preference stored in the database somehow ? |
09:56 | chris | yes |
09:56 | |Lupin| | easier for me than using the web interface |
09:57 | ah that's the piece I missed | |
09:57 | chris | (note rebuild_nozebra not rebuild_zebra) |
09:58 | you can actually look in the nozebra table | |
09:59 | after you have rebuilt the index (it builds the index automatically if you add records through the web interface) | |
09:59 | the nozebra table is what it searches | |
09:59 | |Lupin| | I see |
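Since |Lupin| prefers SQL to the web interface: the switch chris mentions lives in the systempreferences table, and in 3.0 the preference is (as far as I recall) called NoZebra:

```sql
-- Check whether Zebra is bypassed (value '1' means search the nozebra table).
SELECT variable, value FROM systempreferences WHERE variable = 'NoZebra';

-- Flip it from SQL instead of the staff interface.
UPDATE systempreferences SET value = '1' WHERE variable = 'NoZebra';
```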
10:00 | chris: maybe bulkmarcimport could run the right command automatically, or perhaps even have a command-line option to do so ? | |
10:04 | chris | it could, but you need to run different ones depending on whether you are using zebra or not
10:05 | |Lupin| | chris: and the script can not figure out whether zebra is used or not ? |
10:06 | and now the search in the opac works ! Thanks a lot for your help chris, and thanks especially for your patience ! I feel rather dumb regarding koha | |
10:06 | chris | yep, it could, but personally i like running it as 2 separate jobs |
10:08 | |Lupin| | chris: ok |
10:08 | chris | it would be really cool if it could be parallelised |
10:09 | so that they can run at the same time | |
10:09 | |Lupin| | this could be done ?
10:09 | chris | using different threads |
10:09 | |Lupin| | I thought there was a dependency between import and rebuild ?
10:09 | chris | yep, you would need to load at least one record, before starting the index |
10:09 | |Lupin| | or maybe you mean that once a record has been imported, this record can be indexed and the next one could then be imported at the same time ?
10:09 | chris | but you dont need to wait for 80,000 |
10:10 | before you start indexing | |
10:10 | yep | |
10:10 | |Lupin| | understood |
10:10 | yes, would be nice | |
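A toy sketch of the pipelining chris has in mind, using core threads and Thread::Queue; import_record(), index_record() and read_marc_file() are stand-ins defined at the bottom, not real Koha routines:

```perl
#!/usr/bin/perl
# Toy sketch: one thread keeps importing records while another indexes
# the ones already loaded, so indexing does not wait for all 80,000.
use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new;

# Indexer thread: picks records up as soon as they have been imported.
my $indexer = threads->create( sub {
    while ( defined( my $biblionumber = $queue->dequeue ) ) {
        index_record($biblionumber);
    }
} );

# Importer: keeps loading while the indexer works through the backlog.
for my $record ( read_marc_file('records.mrc') ) {
    my $biblionumber = import_record($record);
    $queue->enqueue($biblionumber);
}

$queue->enqueue(undef);    # signal "no more work" to the indexer
$indexer->join;

# Dummy stand-ins so the sketch runs on its own; a real version would call
# the bulkmarcimport and rebuild_zebra/nozebra code here instead.
sub read_marc_file { return 1 .. 10 }
sub import_record  { my $r = shift; print "imported record $r\n"; return $r }
sub index_record   { my $b = shift; print "indexed  biblio $b\n" }
```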
10:11 | chris | the list of things to do always grows :) |
10:11 | |Lupin| | but for a newbie like me, it is not obvious that importing and indexing should be 2 distinct jobs. But I can very well understand that for those who are more familiar with the topic this makes sense
10:12 | chris | if you run with zebra on |
10:12 | the rebuild_zebra.pl needs to be run as a cron job | |
10:13 | |Lupin| | chris: because it is not called when records are added through the web interface, or for another reason ? |
10:13 | chris | yeah, because its slow |
10:13 | so you dont want to slow down the web interface waiting for the indexer to run after every record is changed | |
10:13 | so it runs in the background | |
10:14 | |Lupin| | chris: one other possibility would be to have it run in the background by the web interface, which wouldn't wait for it to terminate... would that make sense ? |
10:14 | chris | it is one possibility |
10:14 | but you would get race conditions | |
10:15 | if 20 people are changing things at once, you might end up with 20 things trying to build/update indexes at once | |
10:15 | |Lupin| | chris: with the cron job you are sure you don't ? just because it runs during the night when every librarian sleeps, or for a stronger reason ? |
10:16 | chris | zebra doesnt deal with that well |
10:16 | no it runs every 5 mins (or 1 min or 10 mins or whenever) | |
10:16 | but only runs once | |
10:16 | |Lupin| | yes, I understand it now, thanks ! |
10:16 | chris | it uses the zebraqueue table to know which records it needs to index |
10:17 | so if one gets changed 5 times in 5 mins, it only gets indexed once (not 5 times) | |
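In practice the background indexer chris describes is just a cron entry like the one below; the path and the -b/-a/-z flags match a 3.0-era rebuild_zebra.pl and should be checked against the local install:

```sh
# Crontab entry for the koha user: every 5 minutes, index whatever the
# zebraqueue table has accumulated (-b biblios, -a authorities, -z queued
# changes only). Path is an assumption; adjust for your install.
*/5 * * * * perl /path/to/koha/misc/migration_tools/rebuild_zebra.pl -b -a -z >/dev/null 2>&1
```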
10:17 | |Lupin| | chris: okay |
10:17 | but 5 changes are done to the zebraqueue table ? | |
10:17 | chris | yep |
10:17 | |Lupin| | got it ! good ! |
10:17 | thanks ! | |
10:29 | perhaps bulkmarcimport could by default display a warning saying that the user should perhaps run rebuild_zebra or rebuild_nozebra, with a -q option or something like that to disable the warning | |
10:30 | chris | the documentation should say that, but yes it wouldnt hurt for the scripts to say that also |
11:23 | heya jdavidb, all settled in? | |
11:24 | jdavidb | Getting there, chris, getting there. We still have many boxes around the apartment, though. |
11:25 | I've about got my commute figured out, though--when to be out there for which bus, etc--to get me here in the shortest time with the least crowd. | |
11:25 | chris | sweet |
11:26 | bout how long is the commute? | |
11:26 | jdavidb | About forty minutes, counting waits. short ride to the train station, three stops along the train line, then another short bus ride. |
11:27 | chris | thats not too bad |
11:28 | jdavidb | If I miss that first bus, I can walk to the train station, and get to work about thirty minutes later than usual. 3/4 mile or so walk. |
11:31 | chris | not too bad as long as its not raining :)
11:32 | jdavidb | If it's raining, I just make dang sure *not* to miss that first bus. :P |
11:32 | chris | :) |
11:34 | jdavidb | It's apparently monsoon season here; I've seen more rain in the last week than I saw in a typical *year* in Abilene. Truly amazing. |
11:34 | Amit | hi jdavidb |
11:34 | jdavidb | Hi, Amit. :) |
11:44 | chris | ohh good idea to use Mollom |