All times shown according to UTC.
Time | Nick | Message |
---|---|---|
13:59 | kados | lea: hi there |
14:00 | morning foxnorth | |
14:01 | foxnorth | hey kados |
14:01 | lea | hi kados |
14:02 | kados i have something to run past you if you have a couple of minutes? | |
14:03 | kados | lea: sure |
14:07 | lea | Basically, in the summer my boss got the "online" version of the proprietary system we use for the libraries |
14:07 | it was an asp.net app and to my disgust we had to set up our very first Windows server | |
14:08 | to cut a long story short, the app doesn't work so he's on the lookout for something else | |
14:08 | He's willing to fork out _some_ money to get a new system in | |
14:08 | we have 2 problems as i can see: | |
14:08 | 1. Migration | |
14:08 | 2. Cost | |
14:09 | Now I'm very comfortable in the open-source world and my boss is too. He'd way rather we go for koha than anything else | |
14:09 | kados | with you so far |
14:10 | lea | during the summer though, i don't think koha had short loans and holidays which was pretty essential for our 3 libraries |
14:10 | and from my point of view, i couldn't quite get my head around how to set it all up correctly | |
14:10 | simply because I'm not a librarian | |
14:11 | so given that my boss has earmarked £5K for getting it sorted, what are our options? (If you know of any) | |
14:12 | kados | akk, phone call |
14:12 | just a sec | |
14:17 | lea | np :) |
14:17 | I heard an interesting webcast interview between a liblime guy and talis. Made me want koha even more. | |
14:32 | kados | lea: that guy was me! :-) |
14:33 | lea: did you look at the new 3.0 version yet? | |
14:52 | lea | oh thought it might have been you ;) |
14:52 | I haven't checked out V3 yet no. It's in SVN/CVS? | |
14:53 | right now $5K is about $10K. What would that buy? | |
14:53 | er i mean £5k | |
14:54 | i'd *really* like a koha system here | |
14:54 | kados | sure |
14:54 | well, if you're looking for prices for commercial support, I'd suggest contacting one of the koha companies | |
14:54 | http://koha.org/support/pay.html | |
14:55 | this channel is mainly for discussing development and technical issues | |
14:55 | V3 is in Git: http://git.koha.org | |
14:55 | http://wiki.koha.org/doku.php?[…]lopment:git_usage | |
14:56 | lea | ok. Not really looking for commercial support, just trying to work out how to get from A to B |
14:56 | although I guess there might be some migration work for someone if they want it | |
14:56 | kados | well, if you don't need commercial support, you won't need 10K :-) |
14:56 | koha's completely free | |
14:56 | so you can download and implement it yourself | |
14:57 | lea | yeah, but we have semi broken marc21 data |
14:57 | kados | most of the companies won't do data migration without some kind of commercial support agreement in place |
14:57 | lea | right ok |
14:57 | I'd mended most of it myself, it was just the subject tags | |
14:57 | with regards to implementing it myself, I got pretty close. | |
14:57 | kados | but if you're a perl hacker you can use MARC::Record on cpan to clean things up |
14:58 | or MarcEdit, also a free utility (runs on windows) | |
14:58 | lea | I used a similar python library |
14:58 | no, it was that our current lib system doesn't export subject tags (250 i think?) | |
14:58 | kados | 650 most likely |
14:58 | lea | ah yeah, 650 |
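
A minimal sketch of the kind of MARC::Record cleanup being described here, for readers of the log: it assumes the exported records sit in a binary MARC file and that the stray "keywords" live in a hypothetical local 690 field; the tag choices and file names are illustrative, not anything Koha prescribes.

```perl
#!/usr/bin/perl
# Sketch only: move terms from a hypothetical local 690 "keywords" field
# into proper 650 subject fields. Tags and file names are illustrative.
use strict;
use warnings;
use MARC::Batch;
use MARC::Field;

my $batch = MARC::Batch->new( 'USMARC', 'export.mrc' );   # assumed input file
open my $out, '>', 'fixed.mrc' or die "fixed.mrc: $!";

while ( my $record = $batch->next ) {
    for my $keyword_field ( $record->field('690') ) {
        for my $pair ( $keyword_field->subfields ) {
            my ( $code, $value ) = @$pair;
            next unless $code eq 'a' and $value =~ /\S/;
            # Second indicator 4 = "source not specified" for a 650
            $record->append_fields(
                MARC::Field->new( '650', ' ', '4', a => $value )
            );
        }
        $record->delete_field($keyword_field);
    }
    print {$out} $record->as_usmarc;
}
close $out;
```
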
14:59 | kados | if you have ISBNs I can imagine writing a script to pull them off of LOC's Z-server ;-) |
14:59 | lea | it uses "keywords" which is stuck in some custom tag somewhere and is full of, well, crap really |
14:59 | yeah, I even did that! | |
14:59 | but a lot of the ISBNs didn't match | |
14:59 | if i remember right, a lot were 10 digit instead of 13? something like that | |
15:00 | i experimented with the titles and got some hits with different ISBNs | |
15:01 | kados | you could wrap it in a call to OCLC's xISBN service |
15:01 | lea | also, back when i was setting up a koha test server, there was no real guide on how to get it going and i found some things a little complicated. Like how it works :) |
15:01 | kados | to get the 13-digit ones too |
15:01 | lea | I've not heard of OCLC |
15:01 | kados | *cough* |
15:01 | lea | I'm in the uk. Is that a good enough excuse? |
15:01 | :) | |
15:02 | kados | hehe |
15:02 | oclc.org | |
15:02 | lea | yeah, found it |
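
On the 10- versus 13-digit mismatch lea mentions: besides looking records up through a service like OCLC's xISBN, an ISBN-10 can be converted to its ISBN-13 form locally, since the mapping is mechanical (prefix 978 and recompute the check digit). A rough sketch of that conversion; the sample ISBN at the bottom is just an example.

```perl
use strict;
use warnings;

# Convert an ISBN-10 to its ISBN-13 form: prepend 978 to the first nine
# digits and recompute the check digit with alternating 1/3 weights.
sub isbn10_to_isbn13 {
    my ($isbn10) = @_;
    ( my $digits = $isbn10 ) =~ s/[^0-9Xx]//g;    # drop hyphens and spaces
    return unless length($digits) == 10;

    my $core = '978' . substr( $digits, 0, 9 );   # old check digit is dropped
    my $sum  = 0;
    my @d    = split //, $core;
    $sum += $d[$_] * ( $_ % 2 ? 3 : 1 ) for 0 .. 11;
    my $check = ( 10 - ( $sum % 10 ) ) % 10;
    return $core . $check;
}

print isbn10_to_isbn13('0596000278'), "\n";   # prints 9780596000271
```
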
15:03 | ok, if this channel is for devel/tech, may i pm you about this, as it's not directly related i guess? | |
15:11 | so, I'll try out koha v3. The interface of v2 was mega confusing to me. But then I'm not a librarian i guess | |
15:16 | hdl | kados: I sent a bugfix for NZ search on 12.12 but could not find it pushed. Did you receive it? |
15:24 | fbcit | g'morning koha |
15:24 | atz | hello fbcit |
15:25 | kados | hdl: hmmm, not yet |
15:25 | hdl | Did you have problems with it ? |
15:25 | kados | I didn't get it |
15:25 | maybe chris has it still? | |
15:57 | lea | should the README.txt indicate how to install zebra? |
16:01 | kados | hmmm |
16:01 | lea: what platform are you on? | |
16:02 | lea | linux (ubuntu 6.06) |
16:02 | kados | ahh |
16:02 | lea | ? |
16:02 | kados | there was a recent announcement on koha-devel about a new installer for Koha |
16:03 | that installer should be included in the main git.koha.org repo very soon, next few days most likely | |
16:03 | lea | is that different to the one that is in the current repo? |
16:04 | kados | yes |
16:04 | different in that it actually works :-) | |
16:05 | lea | well, i'm never one to complain ;) |
16:05 | kados | hehe |
16:05 | lea | is zebra a perl module? I'd really like to get it installed as "in the next few days" I wont be at work :) |
16:06 | kados | zebra is an application, if you're on ubuntu the debian sources for it might work |
16:06 | indexdata.dk/zebra | |
16:07 | lea | ok, is there a "recommended platform"? I'm happy to D/L another OS if that'll make it easier |
16:07 | kados | debian etch seems to be the most popular |
16:07 | and well tested | |
16:09 | lea | so, honestly, is it easy to install on etch? |
16:09 | 21 CDs o.O | |
16:11 | kados | ahh, use the business card install |
16:11 | 1 CD | |
16:11 | easy, I can't say, easier, yes | |
16:11 | we're working on making it easy :-) | |
16:14 | lea | ok got cd1 :) |
16:14 | I'll attempt to install | |
16:14 | fbcit | kados: got a sec? |
16:16 | gmcharlt: I worked on that script till 2am local... | |
16:17 | gmcharlt | fbcit: how'd it go? |
16:17 | fbcit | I cannot get dmake to correctly pass a system call in fix-perl-path.PL to do the mode change |
16:18 | the call works fine as a one liner via perl -e | |
16:18 | it works fine when you call the script as perl fix-perl-path.PL | |
16:18 | but chokes when called by dmake | |
16:18 | I wonder if it is an escaping issue? | |
16:19 | system qq|attrib -r $pathfile|; | |
16:20 | kados | fbcit: do now |
16:21 | fbcit | kados: I ran into a problem with some filenames and subdir names containing white spaces when adapting installer for Win32 |
16:22 | kados | really? I wasn't aware of any such files, we should change them if they exist |
16:22 | fbcit | I have changed this so that the filenames/subdirs use a dash instead |
16:22 | kados | yea, good idea |
16:22 | fbcit | I have a patch ready |
16:22 | kados | ok |
16:22 | fbcit | but it might break some current installs |
16:22 | those that reference these files | |
16:22 | they are gif files in itemtypeimg | |
16:23 | so here comes the patch... | |
16:24 | kados | ahh, the images |
16:29 | gmcharlt | fbcit: maybe use Win32::File (and do something to make sure that module is required only on Windows -- perhaps put it in a separate script?) |
16:59 | fbcit: just sent e-mail with example script; it seems Perl functions stat and chmod work OK on my copy of Strawberry running on XP pro | |
17:02 | fbcit | gmcharlt: great! maybe my strawberry install is behind... |
17:05 | gmcharlt | fbcit: if I haven't messed anything up in my example, then maybe all fix-perl-path.PL needs to do is remove the $^O check around the file permissions stuff |
17:05 | at least until somebody starts work on the VMS port of Koha ;-) | |
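
The gist of that suggestion, roughly, as a sketch (the target path is hypothetical): use Perl's own stat and chmod rather than shelling out to attrib, since on Win32 chmod maps the owner write bit onto the file's read-only attribute.

```perl
use strict;
use warnings;

# Sketch of the stat/chmod approach discussed above: make a file writable
# without shelling out to `attrib`. On Win32, chmod toggles the read-only
# attribute via the owner write bit; on Unix it just adds write permission.
my $pathfile = 'blib/lib/C4/Context.pm';    # hypothetical target file

my @st = stat $pathfile
    or die "stat $pathfile: $!";
my $mode = $st[2] & 07777;                  # keep only the permission bits
chmod( $mode | 0200, $pathfile )
    or die "chmod $pathfile: $!";
```
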
17:06 | fbcit | gmcharlt: you should have an email w/my current script... |
17:07 | system call is ~ line 87. | |
17:12 | gmcharlt: I'm stepping out to lunch... brb | |
17:13 | gmcharlt | fbcit: ok |
17:26 | lea | bah, by default debian installs postgres. Is that an issue? |
17:38 | gmcharlt | hi lea: it won't hurt; however, pg support is still experimental, so you should install the mysql-server-5.0 package via apt-get or aptitude |
17:41 | lea | done already - thanks :) |
17:45 | fbcit | gmcharlt: sorry, I misunderstood you earlier... you sent me an example script... I'll check it out on XP later today/nigh |
17:45 | s/nigh/night/ | |
17:46 | gmcharlt | fbcit: ok; if it works OK, then system attrib may not be needed |
17:46 | fbcit: I took a look at your e-mail | |
17:46 | only reason I can think of why the system attrib +/-r isn't working is that maybe a different PATH is set in the Makefile or used by dmake | |
17:47 | fbcit | I did notice that dmake has a per target option to cause dmake to move to a separate memory space prior to executing the target |
17:47 | gmcharlt | and there's some other attrib executable in the revised PATH |
17:47 | but if Perl stat/chmod works when you test tonight, hopefully won't need to worry about it | |
17:47 | fbcit | right. |
17:48 | I'm for the easiest out.... after ~6hrs of debug :-( | |
17:49 | gmcharlt | understood -- been through that kind of long debug session many times myself |
18:43 | lea | my life is a long debug session |
18:43 | chris | lol |
18:53 | lea | hmm.. I'm following the install guide to the letter and have an issue: the paths in the apache conf are wrong. I've updated document root but I can't seem to find the paths to cgi-bin |
18:58 | gmcharlt | lea: try /usr/lib/cgi-bin/koha/opac for OPAC cgi scripts, /usr/lib/cgi-bin/koha/ for staff interface |
18:58 | (should be improved when installer changes get merged in) | |
18:59 | lea | yeah thanks |
18:59 | just need to find koha-conf.xml now :) | |
19:03 | gmcharlt | /usr/share/koha/etc/koha-conf.xml (yes, things are currently rather scattered :) ) |
19:06 | lea | gah, i've lost my template directory now >.< |
19:07 | yay! | |
19:07 | fbcit | kados: try navigating to cataloguing and then use the banner search to search for a term you know is not in any title in the db. |
19:08 | kados | k |
19:08 | o results found | |
19:08 | Biblios in reservoir | |
19:08 | None | |
19:08 | tried a search for 'fizbit' | |
19:09 | fbcit | hrmm.. |
19:09 | kados | what do you get? |
19:09 | fbcit | Can't use an undefined value as an ARRAY reference at /usr/lib/cgi-bin/kohaclone1/cataloguing/addbooks.pl line 81. |
19:09 | kados | hrmm.. |
19:10 | lea | I'll wait for the installer ;) Thanks all and good evening. |
19:10 | fbcit | cgi-bin/koha/cataloguing/addbooks.pl?q=fizbit |
19:10 | kados | my $total = scalar @$marcresults; |
19:10 | is line 81 | |
19:11 | fbcit | not sure what's happening... |
19:11 | kados | could you throw a warn in |
19:11 | use Data::Dumper; | |
19:11 | my ( $error, $marcresults ) = SimpleSearch($query); | |
19:11 | warn Dumper($marcresults); | |
19:11 | (around line 71) | |
19:12 | gmcharlt | fbcit: also warn ($error); |
19:13 | fbcit: should be undef if you get to line 81, but just in case | |
19:16 | fbcit | Dumper says: $VAR1 = undef; |
19:20 | gmcharlt | fbcit: are you running NoZebra? |
19:20 | fbcit | yes... very weird behaviour now... |
19:21 | NoZebra on this install, yes | |
19:22 | must be something, because once Dumper operates on @$marcresults, the script runs fine and returns a results not found page | |
19:23 | could it be NoZebra that causes @$marcresults to return undef? | |
19:24 | kados | that's likely |
19:24 | paul/hdl or mason should be able to fix that up | |
19:24 | fbcit: can you file a bug on that? | |
19:24 | fbcit: and I'd say that's a blocker since it causes a 500 error | |
19:25 | should also be pretty simple to fix | |
19:25 | fbcit | right... |
19:25 | kados | :) |
19:25 | fbcit | hehe |
19:25 | kados | in my defense, the searching with zebra is really top notch :-) |
19:26 | fbcit | but somebody has to find the bugs... ;-) |
19:26 | kados | :) |
19:27 | owen | Hi everybody |
19:27 | kados | hey owen |
19:27 | gmcharlt | hi owen |
19:27 | fbcit | hi owen |
19:29 | kados | owen: quick question for ya ... is there any reason to keep all of those images for itemtypes that have spaces in the name? |
19:29 | owen: can we just use the itemtypecode now for naming them? | |
19:30 | owen | We should be able to, although we haven't quite settled on a way to handle item type images, judging from the differences in the templates |
19:30 | gmcharlt | fbcit: I reproduced your bug, btw |
19:31 | owen | It's something that needs an 'audit' |
19:31 | kados | owen: yea, it exists at two levels now, depending on a global syspref |
19:31 | owen: either at the bib or item level | |
19:31 | yea, it does need an audit | |
19:48 | fbcit | hrmm... line 943 of Search.pm says: # IMO this subroutine is pretty messy still -- |
19:48 | owen | Yeah, that really should be changed to IMNSHO |
19:48 | ;) | |
19:49 | fbcit | I may have had a better understanding of it @ 2am this morning... :-) |
19:51 | gmcharlt | fbcit: midnight_oil++ |
19:51 | or 2am_oil++ | |
19:52 | fbcit | actually after 12am it's $time_oil-- |
20:06 | ok | |
20:06 | kados or gmcharlt | |
20:06 | gmcharlt | fbcit: yep? |
20:07 | fbcit | looks like NZanalyse returns empty results, NZorder operates on empty results and passes all of this back to addbooks.pl which has no provision for handling empty results. |
20:07 | chris | i always read that as New Zealand analyse |
20:08 | gmcharlt | fbcit: looks like it should be return ref to an empty list instead of undef then |
20:08 | chris | yeah |
20:08 | fbcit | k |
20:09 | gmcharlt | also compare with how cataloguing/addbiblio.pl checks results of SimpleSearch |
20:14 | fbcit | gmcharlt: looks to me like addbiblio.pl handles the undef $result by else condition |
20:14 | shouldn't addbooks.pl take the same approach? | |
20:14 | rather than messing with Search.pm? | |
20:14 | gmcharlt | fbcit: that would be safer, I agree |
20:15 | fbcit | where is the code that handles the same case w/Zebra? |
20:17 | owen | Hi tim |
20:18 | tim | Hi owen |
20:18 | You're probably the one who can help me. | |
20:18 | Template problem. | |
20:18 | owen | Okay |
20:18 | gmcharlt | fbcit: SimpleSearch is (an) entry point to both Zebra and NoZebra searches |
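
For readers following along, the caller-side guard being discussed (mirroring how addbiblio.pl copes with an undef result) would look something like this in cataloguing/addbooks.pl; it is a sketch built around the two lines kados quoted earlier, not the patch that was actually applied.

```perl
use C4::Search;    # provides SimpleSearch, as used by addbooks.pl

my $query = 'fizbit';    # the known-absent test term from the log

# Guard against SimpleSearch handing back undef instead of an arrayref
# (which is what the NoZebra path did here), so line 81 cannot blow up.
my ( $error, $marcresults ) = SimpleSearch($query);
warn "SimpleSearch error: $error" if defined $error;

my $total = defined $marcresults ? scalar @$marcresults : 0;
```
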
20:20 | tim | I'm adding clickable URLs to our OPAC. Took the code from the CSS template and got it working as well as it does in CSS. |
20:20 | But it only shows the first URL. | |
20:21 | It's in a TMPL_LOOP, but doesn't seem to loop through the URLs in 856 | |
20:22 | I still don't know enough about templating to figure out what's wrong. | |
20:23 | Looks like CSS is the only template that uses it. | |
20:23 | owen | tim, I suspect it's the script's fault, not the template |
20:23 | You're talking about opac-detail, right? | |
20:24 | tim | Yes. I guess I should've mentioned that. |
20:25 | I'm as good with perl as I am with templates, but I'll look into it. | |
20:25 | owen | And it's <!-- TMPL_LOOP name="URLS" --> ? |
20:25 | tim | Yup |
20:26 | fbcit | duh... |
20:26 | s/$results/\@$results/g @ line 206 in Search.pm and things work fine. | |
20:29 | sloppy syntax... | |
20:35 | gmcharlt | fbcit: you mean my @$result = NZorder(NZanalyse($query))->{'biblioserver'}->{'RECORDS'}; |
20:35 | ? | |
20:37 | fbcit | one moment... phone |
20:48 | gmcharlt: no, I mean return (undef,\@$result); | |
20:48 | $results appears to be an unnamed array. | |
20:50 | I want to pass back a ref to it just as the Zebra version of SimpleSearch does. | |
20:50 | gmcharlt | fbcit: NZOrder returns a hashref |
20:51 | if there are hits, the hashref has 'hits' and a 'biblioserver' key | |
20:51 | 'biblioserver' contains hashref that contains 'RECORDS' key, which is an arrayref of biblionumbers | |
20:52 | problem is that if there are no hits, 'hits' key exists but rest doesn't | |
20:52 | so I'm inclined to say that the fix should: | |
20:52 | 1. not stack the calls to NZOrder and NZanalyse | |
20:52 | 2. store return of NZOrder in a separate var (e.g., $nz_results) | |
20:52 | 3. check $nz_results->{'hits'} | |
20:53 | if exists and > 0, set $results = $nz_results->{'biblioserver'}->{'RECORDS'} | |
20:53 | if == 0, set $results = [] | |
20:54 | 4. ponder that one can occasionally be *too* clever with Perl expressions | |
20:55 | alternatively, change NZOrder so that {'biblioserver'}->{'RECORDS'} always exists in return | |
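
Spelled out as code, that plan amounts to roughly the following inside the NoZebra branch of SimpleSearch in C4/Search.pm (a sketch of the proposal as stated, not the patch that was eventually sent):

```perl
# Sketch of the fix outlined above: unstack the NZanalyse/NZorder calls,
# keep NZorder's hashref in its own variable, and always return an array
# reference, even when there are no hits.
my $nz_results = NZorder( NZanalyse($query) );

my $results;
if ( $nz_results->{hits} && $nz_results->{hits} > 0 ) {
    $results = $nz_results->{'biblioserver'}->{'RECORDS'};
}
else {
    $results = [];    # no hits: empty list, but still an arrayref
}
return ( undef, $results );
```
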
20:56 | kados | someone really needs to clean up the variable names for all the nozebra code |
20:57 | there's some serious misunderstanding of the terms operator, operand, parameter, index, etc. | |
20:58 | tim | Ok. I could be looking in the wrong direction and sure don't know how to fix it anyway. |
20:58 | I think I may have found a problem in C4/Search.pm that keeps the URLs from listing. | |
20:59 | At least with our records. | |
20:59 | It looks like it's splitting separate URLs from a string with a pipe '|' between URLs | |
21:00 | Ours each have their own 856u tags | |
21:00 | kados | *cough* |
21:01 | tim: do you have a sub called something like GetMarcUrls ? | |
21:05 | tim | Didn't find it. |
21:08 | kados | looks like you've got getMARCnotes and getMARCsubjects |
21:08 | =head2 my $marcurlsarray = &getMARCurls($dbh,$bibid,$marcflavour); | |
21:08 | :q | |
21:09 | doesn't look like the actual function is anywhere in your installation | |
21:09 | but basically, that's the function you're looking for | |
21:09 | :) | |
21:10 | tim | Is it a problem adding it? |
21:19 | kados | shouldn't be |
21:19 | just need to find it ... should be in the latest rel_2_2 repo | |
21:19 | grep -r getMARCurls * | |
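
As an aside, the behaviour tim wants (one link per 856$u rather than a single pipe-delimited string split apart) comes down to iterating every 856 field of the record when building the template's URLS loop. Below is a sketch of a getMARCurls-style helper, assuming MARC::Record objects; the 'url' hash key and the template call in the comment are guesses, not the 2.2 API.

```perl
use MARC::Record;

# Sketch of a getMARCurls-style helper: collect every 856$u in the record
# (whether the URLs sit in separate 856 fields or repeat within one) and
# return an arrayref of hashrefs for a TMPL_LOOP. The 'url' key is a guess.
sub get_marc_urls {
    my ($record) = @_;
    my @urls;
    for my $field ( $record->field('856') ) {
        for my $pair ( $field->subfields ) {
            my ( $code, $value ) = @$pair;
            push @urls, { url => $value } if $code eq 'u';
        }
    }
    return \@urls;
}

# e.g. in the opac-detail script:  $template->param( URLS => get_marc_urls($record) );
```
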
21:22 | fbcit | gmcharlt: to further obfuscate... ( $result->{hits} && $result->{hits} > 0 ) ? $result : $result = []; |
21:23 | fixes the problem | |
21:23 | correctly hopefully | |
21:24 | tim | Just get the latest SearchMarc.pm ? |
21:24 | fbcit | btw, I left the stacked calls in place. |
21:24 | kados | tim: I'm not sure if it's a drop in replacement |
21:24 | tim: but you could try ... just make sure to keep a back-up copy | |
21:24 | tim: of the current one | |
21:24 | gmcharlt | fbcit: yep, but perhaps a little too obscure: I suggest at least assigning the output of the ? : to a different var, then returning it |
21:25 | i.e., my $search_results = ( foo ? bar : [] ); | |
21:27 | fbcit | done and done. |
21:27 | my $search_result = ( $result->{hits} && $result->{hits} > 0 ? $result : [] ); | |
21:27 | return (undef,$search_result); | |
21:27 | tim | I just tried it on the virtual machine I downloaded. I think it's 2.2.9 and it has getMARCurls and still doesn't work. |
21:27 | kados | hmmm |
21:28 | tim | Just displays the first URL |
21:28 | kados | must not be using getMARCurls then |
21:29 | fbcit | && there goes the patch. |
21:29 | gmcharlt | fbcit: my $search_result = ( $result->{hits} && $result->{hits} > 0 ? $result->{'biblioserver'}->{'RECORD'} : [] ); # perhaps? |
21:29 | kados | tim: sorry I'm not being very helpful :-) |
21:30 | gmcharlt | i.e., does searching on a keyword that *is* present work? |
21:30 | tim | No problem. I'd like to be half as helpful. |
21:31 | fbcit | gmcharlt: arg! |
21:31 | gmcharlt | fbcit: no, parameter! |
21:32 | fbcit | not quite: Can't coerce array into hash at /usr/share/kohaclone1/C4/Search.pm line 207. |
21:34 | gmcharlt | fbcit: did you change preceding line to be just my $result = NZorder(NZanalyse($query)); |
21:34 | fbcit | my $result = NZorder(NZanalyse($query))->{'biblioserver'}->{'RECORDS'}; |
21:34 | my $search_result = ( $result->{hits} && $result->{hits} > 0 ? $result : [] ); | |
21:34 | return (undef,$search_result); | |
21:36 | my $result = NZorder(NZanalyse($query)); #->{'biblioserver'}->{'RECORDS'}; | |
21:36 | my $search_result = ( $result->{hits} && $result->{hits} > 0 ? $result : [] ); | |
21:36 | return (undef,$search_result); | |
21:38 | gmcharlt: the search on a valid term breaks badly w/ the latter 3 lines... | |
21:40 | gmcharlt | fbcit: try this: |
21:40 | my $result = NZorder(NZanalyse($query))->{'biblioserver'}; | |
21:40 | my $search_result = ($result->{hits} && $result->{hits} > 0) ? $result->{'RECORDS'} : []; | |
21:40 | return (undef,$search_result); | |
21:40 | fbcit | it returns titles in the biblio reservoir, but not in the catalog |
21:40 | gmcharlt | my bad re structure of returned hash |
21:42 | fbcit | no good |
21:42 | now addbooks.pl complains: Not an ARRAY reference at /usr/lib/cgi-bin/kohaclone1/cataloguing/addbooks.pl line 81 | |
21:43 | gmcharlt | hmm -- works for me -- is your addbooks.pl back in sync with HEAD? |
21:45 | fbcit | git diff HEAD shows only Search.pm as changed |
21:46 | my $result = NZorder(NZanalyse($query))->{'biblioserver'}; | |
21:46 | my $search_result = ( $result->{hits} && $result->{hits} > 0 ? $result : [] ); | |
21:46 | return (undef,$search_result); | |
21:46 | ahh... | |
21:46 | gmcharlt | need ->{'RECORDS'} in the ternay |
21:46 | ternary, rather | |
21:47 | fbcit | right, works now. |
21:47 | both ways | |
21:49 | gmcharlt | just checked other intranet catalogue search and OPAC -- don't seem to have broken them with this change |
21:50 | NZorder is suffering from a bad case of copy-and-pasteitis | |
21:53 | fbcit | gmcharlt: sent a corrected patch off, though it's mostly your work. |
21:53 | gmcharlt | fbcit: thanks |
21:54 | fbcit | I'm off. I'll try that script on XP tonight. |
21:54 | gmcharlt | ok |
21:56 | fbcit-away | btw kados, I ordered a metrologic scanner today. |
21:57 | gmcharlt | fbcit-away: fyi, kados has done a merge of the installer tree into main |
21:58 | fbcit-away | great |
22:31 | kados | fbcit-away: cool |
22:31 | fbcit-away: did you get it from posguys? :-) | |
23:21 | fbcit | gmcharlt: have you run fix-perl-path.PL from a command line against blib? |
23:28 | masonj: did you get a chance to look at the position of the barcodes on labes? | |
23:28 | masonj | hiya fbcit |
23:29 | fbcit | s/labes/labels/ |
23:29 | masonj | no i didn't, things are pretty busy at the moment, with urgent things |
23:30 | fbcit | np |
23:30 | masonj | yep, will do soon tho |
23:31 | fbcit | I'll look later this week too. |
23:31 | tnx | |
23:31 | masonj | np :) |
23:45 | gmcharlt | fbcit: just tried it -- seems to work ok |
23:45 | test was on Debian, btw | |
23:49 | fbcit | maybe it's my machine... |
23:49 | try it on Win32 if you can | |
23:50 | I have to go, but if you can try it, let me know what happens. | |
00:09 | masonj | hdl, u about on irc? |
00:11 | must be late in france now... | |
05:16 | [K] | *** part FreeNode!#koha: rangi n=chris203-118-134-114.netspace.net.nz |
05:16 | *** join #kohaFreeNode: rangi n=chris203-118-134-114.netspace.net.nz | |
07:56 | hdl | masonj: ? |
08:12 | masonj | hi hdl |
08:13 | hdl | hi |
08:13 | late for you. | |
08:13 | I sent a patch on 12.12 that fixed your problem. | |
08:13 | Indeed -X was a way to weight values. | |
08:13 | masonj | ah, i couldnt quite find it |
08:14 | hdl | But when doing an AND it failed. |
08:14 | masonj | could find your patch... |
08:14 | hdl | I think kados has not pushed it yet. |
08:14 | masonj | s/could/could not/ |
08:14 | aaah, right | |
08:15 | hdl | maybe i can send it to you directly |
08:15 | masonj | cool hdl, thanks for that |
08:15 | yeah, yes please | |
08:16 | take a look at bug 1677... | |
08:17 | http://bugs.koha.org/cgi-bin/b[…]w_bug.cgi?id=1677 | |
08:18 | hdl | my patch strips the weight values. |
08:18 | masonj | right :), |
08:19 | so, a change to rebuild_nozebra.pl is not needed? | |
08:19 | hdl | no. |
08:20 | masonj | ok, ill try that out |