IRC log for #koha, 2006-07-26


All times shown according to UTC.

Time Nick Message
12:58 kados http://www.oclc.org/research/projects/fast/
13:02 tumer[A] kados:check mail. Was it clear enough?
13:28 hdl kados around?
13:34 kados hdl: yep
13:34 hdl: what's up?
13:34 hdl I have a problem
13:34 MacOSX perl Module installation
13:34 It seems it cannot launch make
13:35 Do you know how to cope with it ?
13:36 kados hmmm
13:36 thd hdl: what version of OSX?
13:37 kados hdl: what's the exact error?
13:37 dewey i guess the exact error is just before :
13:37 hdl kados : make command not found.
13:37 kados hdl: and what module?
13:38 hdl:  it sounds like you need to install the developer's tools
13:38 hdl:  on the original CD
13:38 hdl: Xcode tools
13:38 hdl The first I tried was MARC::Record.
13:38 thd hdl: you have to have the developer's tools disc installed, well kados types faster
13:38 hdl But is there a way to install Perl modules without building them?
13:39 kados hdl: yes
13:39 hdl: look in the lib dir
13:39 hdl I always use perl -MCPAN -e install ?
13:39 kados hdl: and copy it into your perl path
13:39 all make does is run the tests
13:39 and copy things over to the right spots
13:39 so you can do it manually
13:39 it's just a lot of work
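A minimal sketch of the manual install kados describes, for a pure-Perl module such as MARC::Record when make is unavailable; the source and destination paths are illustrative assumptions, not taken from the log:

    #!/usr/bin/perl
    # Copy the unpacked distribution's lib/ tree into a directory perl can see.
    use strict;
    use warnings;
    use File::Find;
    use File::Copy;
    use File::Path qw(mkpath);
    use File::Basename qw(dirname);
    use File::Spec;

    my $src  = '/tmp/MARC-Record/lib';                # unpacked CPAN tarball (illustrative)
    my $dest = '/usr/local/koha/intranet/modules';    # any directory you will put on @INC

    find(sub {
        return unless -f $_ && /\.pm$/;
        my $rel    = File::Spec->abs2rel($File::Find::name, $src);
        my $target = File::Spec->catfile($dest, $rel);
        mkpath(dirname($target));
        copy($File::Find::name, $target) or die "copy $File::Find::name: $!";
    }, $src);

    # Then point perl at it, via PERL5LIB or in the calling script:
    #   use lib '/usr/local/koha/intranet/modules';
    #   use MARC::Record;

This only works for modules with no XS/C parts; anything that needs a compiler still requires the developer tools.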
13:40 hdl: install Xcode tools and it should work properly
13:40 hdl Holy ...
13:40 It is on a machine in Switzerland.
13:40 kados hehe
13:40 hdl via ssh.
13:40 kados hmmm
13:40 that is a problem :-)
13:41 hdl :D
13:41 thd hdl: the incompatibilities of building things on OSX are also a lot of work
13:41 kados there is no way to install remotely that I know of
13:41 thd kados: I suspect there is a way
13:41 kados hdl: one way would be via VNC
13:41 thd hdl: is this the server version of the software?
13:42 hdl I think it is MacOSX Server Edition.
13:42 kados hdl: http://tomclegg.net/xcode-remote-install
13:42 thd hdl: in any case, you can download the developer's tools
13:42 kados hdl: I'm not responsible if it breaks their system :-)
13:43 hdl: so according to that guy you can do it
13:43 thd hdl: you should also install fink which has ports from Debian so that you can install things correctly
13:44 hdl: you need developer's tools first
13:45 hdl: you have to match versions correctly, so be certain you know which animal you have (Panther, Cheetah, etc.), that is, which OS 10.x
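A small sketch of how one might check, over ssh, which OS X release the remote box runs and whether the build tools are already there; sw_vers and which are standard commands, the rest is illustrative:

    #!/usr/bin/perl
    # Run on the remote OS X machine to report the release and look for a toolchain.
    use strict;
    use warnings;

    chomp(my $osx = `sw_vers -productVersion 2>/dev/null`);
    print 'OS X version: ', ($osx || 'unknown'), "\n";

    for my $tool (qw(make gcc cc)) {
        chomp(my $path = `which $tool 2>/dev/null`);
        printf "%-4s : %s\n", $tool, ($path || 'not found (install the developer tools)');
    }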
13:46 hdl kados: Are there any tricks I should be aware of before installing?
13:46 For instance : apache is named httpd
13:46 kados i only run OSX as a desktop
13:46 don't have Koha installed
13:47 I use Debian Sarge for all Koha installations
13:47 thd hdl: you should get a copy of "OSX for Unix Geeks"
13:47 hdl services are launched with service httpd restart
13:47 thd hdl: I successfully installed Koha 2.2 on OSX.
13:48 shedges thd:  did your install include the Z3950 search?
13:49 thd shedges: I wasted months trying to find an alternative  a year ago
13:50 shedges me, too.  I thought maybe somebody had solved the problem.
13:50 thd shedges: so no, I would have needed to recompile Apache and that might have broken other things so I refused to do more work on OSX after a certain point.
13:52 shedges, hdl, kados: Apple made OSX sufficiently different from FreeBSD that doing work on the command line in the proper Unix way was a recurring exercise in frustration over minor incompatibilities
13:54 hdl: I did get things to work, but I had to Google to work around errors too often when installing some things from source
13:54 hdl: Koha should not be a major problem
13:54 hdl thd: Do you have some guidelines for Koha Installations ?
13:55 thd hdl: It was too long ago to remember except that you need to install the developer tools and I would recommend installing Fink.
13:57 hdl: I remember some problems with the directory where OSX puts man pages
13:58 hdl: Google searches on site:macosxhints.com are helpful
14:01 hdl: often you may find that you want to install things with a different version than what Apple provides; Fink has a separate, non-conflicting directory for doing just that
14:07 hdl: I really needed the OSX for Unix Geeks book to find my way around how Apple renamed everything etc.  The book also has some undocumented commands
14:12 kados: where is tumer's message?
14:14 hdl thd: Many thx.
14:16 thd hdl: you are quite welcome.  OSX is great for the GUI but is not what you want to install Unix software on if you had a choice.
17:43 kados: are you there?
18:05 tumer hello
18:05 tumer: what was the message that you sent?
18:05 tumer hi i am still struggling
18:06 oh i sent a report to ID about the issue we discussed
18:06 thd tumer: with what are you still struggling?
18:06 tumer: kados thought of a way around the problem for some cases
18:07 tumer like ???
18:09 thd tumer: you could put record IDs which needed matching into some local use field and index on the local use field, which would then be the same field for bibliographic and holdings records.
18:10 tumer thd: i think i could not make myself understood very well. give me a mail address and i will send you an email
18:11 thd tumer: you could then have 99X for the bibliographic ID in both bibliographic and holdings records
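A minimal sketch of the linking thd suggests, using MARC::Record; the choice of 999 subfield b as the local use field is a placeholder, not an established Koha convention:

    #!/usr/bin/perl
    # Carry the bibliographic ID in the same local-use field of both the
    # bibliographic and the holdings record, so one index serves both.
    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    my $biblionumber = 12345;    # illustrative ID

    my $bib      = MARC::Record->new();
    my $holdings = MARC::Record->new();
    $bib->append_fields( MARC::Field->new('245', '1', '0', a => 'Some title') );

    for my $rec ($bib, $holdings) {
        $rec->append_fields( MARC::Field->new('999', ' ', ' ', b => $biblionumber) );
    }

    # Zebra would then index 999$b in both record types, so a search on the
    # bibliographic ID retrieves the matching holdings records as well.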
18:12 tumer authority and bibliographic works fine
18:13 thd tumer: try koha at agogme.com
18:14 tumer well it is the authority records about which I have the greatest concern
18:15 tumer agogme.com what do i do there?
18:15 thd tumer: I want to be able to search the references and tracings for multiple authority records and return bibliographic records
18:15 tumer did you not want an email address?
18:16 tumer yes your email please
18:16 thd tumer: koha at agogme.com
18:16 tumer oops sorry
18:19 thd: koha-devel already does that
18:20 we do a similar thing with items as well
18:20 search biblios and retrieve related items -- no problem
18:20 or vice versa
18:21 but an SQL-like JOIN is what we are after
18:22 i want to be able to limit the search to 120,000 records out of 200,000 depending on which criteria i put in, like a branch
18:27 thd tumer: yes, had you posted to koha-devel?
18:28 tumer not yet
18:30 thd oh I have the message now
18:32 tumer: how did you implement the recursive search of IDs?  What was recursive about your search?
18:34 tumer well, search author and get 1000 records; search branch main and get 120,000 records; then do a search of the 1000 biblionumbers within the 120,000
18:34 result gives you 121 records
18:35 thd tumer: I guess it is recursive if each record must search for its own ID again
18:35 tumer yes, you do it 1000 times
18:37 thd tumer: do you have problems with one database failing to respond if you search multiple databases on the same server?
18:38 tumer no, but try 1000 searches of biblionumbers one after the other and everything is at a standstill
18:39 thd tumer: will you not have the same issue when you have 1000 simultaneous users? :)
18:40 tumer not really, scripts work faster than fingers
18:41 thd tumer: so 1000 simultaneous users is really only ten simultaneous users?
18:42 tumer and they only retrieve some records, while i have to retrieve all 1000, extract biblionumbers, then do a search for each to get 120 results
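For illustration, a sketch of the loop tumer describes, written against the ZOOM Perl API; the host, port, database name and the use attributes (1=12 for the local number, 1=9011 for the branch) are assumptions, not Koha's actual configuration:

    #!/usr/bin/perl
    # Re-search each biblionumber from the author result set, limited to a
    # branch, and count the hits -- one round trip per record, which is why
    # it crawls and why an SQL-like join on the server would be preferable.
    use strict;
    use warnings;
    use ZOOM;

    my @biblionumbers = (1 .. 1000);     # IDs extracted from the author search (illustrative)
    my $branch        = 'MAIN';

    my $conn = ZOOM::Connection->new('localhost', 9999, databaseName => 'biblios');

    my $matches = 0;
    for my $bn (@biblionumbers) {
        my $rs = $conn->search_pqf(qq{\@and \@attr 1=12 $bn \@attr 1=9011 "$branch"});
        $matches++ if $rs->size() > 0;
        $rs->destroy();
    }
    print "$matches of ", scalar(@biblionumbers), " records are at $branch\n";
    $conn->destroy();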
18:48 thd tumer: I hope Index data supports record linking.  I think they must because they helped with a demo system that must have had linked record indexes.
18:49 tumer demo where?
18:50 can we look at it?
18:50 thd tumer: it has not been working for a few weeks and it is years old
18:51 tumer: I think Index Data forgot about it
18:51 tumer well lets hope and wait
18:53 thd tumer: maybe they never ran that whole system but they had won the contract to help build it
18:55 kados: are you there?
18:55 kados: FAST is not a solution for records with LCSH
18:57 kados: FAST was Chan's idea from a decade ago to simplify LCSH for the era when no one knew how to catalogue anymore
02:06 osmoze hello #koha
02:11 btoumi hi all
02:19 osmoze good morning btoumi
02:19 btoumi good morning osmoze
02:19 how are you this morning?
02:22 osmoze fine, fine, like a Monday :) and you?
02:26 btoumi osmoze: it's Tuesday??? that must be a joke :=)
02:27 osmoze ah yes, but no, it's not a joke, I work on Saturdays so your Tuesday is my Monday :)
02:27 btoumi lol
02:27 ok
02:27 I worked Saturdays for five years, so I understand you
02:29 osmoze ^^
02:29 btoumi osmoze: and is your investigation work on koha coming along?
02:29 osmoze those are the joys of working in a public lending library :/
02:29 btoumi, investigation of what?
02:29 btoumi on koha => testing etc.
02:31 osmoze no, not at the moment, I am in the middle of reinstalling the public and staff PCs. The big summer clean-up. So right now, koha is not the priority ^^
02:33 by the way, I have to go to another site, see you later
02:34 btoumi ah ok
02:34 paul tada ... my internet connection works!
02:34 btoumi yesssssssssssss!
02:34 good news for Paul
02:47 qiqo ei help..
02:48 paul toins: you can come back to the office whenever you want ;-)
02:48 qiqo hello paul
02:48 can you help me?
02:48 toins paul, ah, great !!!
02:49 qiqo im getting this error message while updating koha to 2.3.0: not well-formed (invalid token) at line 2, column 8, byte 9 at /usr/lib/perl5/site_perl/5.8.6/i486-linux/XML/Parser.pm line 187
02:49 from 2.2.5 to 2.3.0
02:49 toins paul, I don't have a car right now... so this afternoon
02:50 chris theres about 0% chance that an upgrade from 2.2.5 to 2.3.0 will work
02:50 qiqo ah really?
02:50 chris just installing 2.3.0 and getting it to work is very hard
02:50 qiqo why is that so chris?
02:50 chris yes, 2.3.0 is purely a development release
02:52 http://savannah.nongnu.org/for[…]php?forum_id=4530
02:52 qiqo :(
02:52 chris tells you a little bit more about it
02:52 qiqo ah alright, i understand
02:52 chris if you actually want to use it, id wait for a 2.4.x release
02:53 if the second number is an odd number
02:53 then its unstable/development and only *might* work
02:53 qiqo ermm ok so i have another question... how do i reset the Z3950 module
02:53 chris if its even, like 2.2 or 2.4 ... then it should work
02:53 qiqo ahh ok now i understand
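A tiny sketch of the numbering rule chris explains; purely illustrative:

    #!/usr/bin/perl
    # Odd second number (2.3.x, 2.5.x) = unstable/development release;
    # even second number (2.2.x, 2.4.x) = release intended for production.
    use strict;
    use warnings;

    sub release_kind {
        my ($version) = @_;
        my (undef, $minor) = split /\./, $version;
        return $minor % 2 ? 'unstable/development' : 'stable';
    }

    print "$_ => ", release_kind($_), "\n" for qw(2.2.5 2.3.0 2.4.0);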
02:54 chris to go back down to a previous version? hmm not sure about that
02:54 i think just get the older version and install that
02:54 but im not sure about that
02:54 qiqo yup im doing that as of the moment
02:55 i run koha.upgrade for 2.2.5
02:55 ok im back with 2.2.5
02:57 my Z3950 module isnt working
02:58 chris did you upgrade it when you were trying to upgrade to 2.3 ?
02:59 qiqo nope, the upgrade never finalised
03:00 hdl hello all
03:00 qiqo hello hdl
03:00 chris was the z3950 working before you started upgrading?
03:00 hi hdl
03:00 qiqo nope it wasnt.. actually i just finished installing 2.2.5 a while ago
03:00 then i attempted an upgrade
03:00 chris its a bit tricky to get going
03:01 the main thing to check is do you have z3950 daemon running?
03:01 qiqo according to the mailing list, after adding a z3950 server i have to restart the daemon
03:02 chris maybe, but i dont think so, you do need to have the daemon running though
03:02 it will log
03:02 to /usr/local/koha/log/
03:02 so you can see what its doing
03:03 qiqo i think it's not running
03:03 my query is included in the koha-errorlog
03:04 chris z3950-daemon-launch.sh
03:04 is the script to start it
03:04 in /usr/local/koha/intranet/scripts/z3950daemon
03:04 qiqo yup
03:05 how do you know if its loaded?
03:06 chris what happens when you type
03:06 ps axf | grep "z3950"
03:06 should get something like
03:06 1999 pts/17   S      0:00 su -c /usr/local/koha/intranet/scripts/z3950daemon/z3950-daemon-shell.sh - www-data
03:06 2000 pts/17   S      0:04  \_ /usr/bin/perl /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue /usr/local/koha/log
03:06 qiqo 4904 pts/1    S+     0:00  |   |   \_ grep z3950
03:06 chris right its not running then
03:07 qiqo errmm..
03:07 chris in /usr/local/koha/log
03:07 are there any files like
03:07 z3950-daemon-20060725-2005.log
03:07 qiqo yup
03:07 there is one
03:07 chris todays date?
03:07 qiqo yup
03:08 chris what does it say?
03:08 qiqo Bareword "Net::Z3950::RecordSyntax::USMARC" not allowed while "strict subs" in use at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 260.
03:08 chris there we go, thats why its not starting
03:09 2 secs i think i remember the fix for this
03:09 qiqo line 260 says: eval { $conn->option(preferredRecordSyntax =>
03:10 i think the problem is between MARC21 and USMARC
03:10 are MARC21 and USMARC alike? because my teacher in library science told me that they are somewhat different
03:10 chris you could try commenting that line out
03:11 qiqo oh its running
03:13 ohh nope false alarm
03:13 ehehe it still is not running
03:13 chris anything else in the error log now?
03:13 qiqo Bareword "Net::Z3950::RecordSyntax::UNIMARC" not allowed while "strict subs" in use at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 261.
03:14 ill have the two commented out
03:14 then ill try again
03:14 chris right
03:16 actually
03:16 qiqo heres what i get in ps: 4950 pts/1    S      0:00 /usr/bin/perl /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue /usr/local/koha/log
03:17 chris try making it $Net::Z3950 ....
03:17 ie add the $
03:17 that looks like its running
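For reference, the change chris suggests in processz3950queue, reconstructed from the fragments quoted in the log (so treat line numbers and context as approximate); whether $Net::Z3950::RecordSyntax::USMARC actually carries the right value depends on the installed Net::Z3950, but in the log it at least lets the daemon start:

    # Original (fails under 'use strict' when the constant is not visible
    # at compile time, giving the "Bareword ... not allowed" error):
    #
    #     eval { $conn->option(preferredRecordSyntax =>
    #                Net::Z3950::RecordSyntax::USMARC) };
    #
    # chris's workaround: add the '$' so perl reads it as a package variable.
    eval {
        $conn->option(preferredRecordSyntax =>
                      $Net::Z3950::RecordSyntax::USMARC);
    };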
03:18 qiqo hurrah
03:19 still im not getting results from the library of congress
03:19 :(
03:20 chris whats the log tell you
03:20 loc doesnt have the most reliable server
03:21 you could try monash
03:21 qiqo what servers can i use?
03:21 the logs are blank
03:21 chris zconn.lib.monash.edu.au
03:21 7090
03:21 did you do an isbn search or a title one?
03:22 db is voyager for monash
03:22 qiqo yup
03:22 chris which one isbn?
03:22 qiqo harry potter 1
03:22 the isbn of hp
03:23 then i tried harry potter
03:23 chris the little popup window comes up, but never gets any results?
03:25 qiqo none
03:25 no results found
03:25 and i dont get any popup window
03:25 chris umm you should, that's where the results show up
03:26 so you go to add biblio, choose add new biblio?
03:26 qiqo no popups :(
03:26 yeah
03:27 chris you end up at a page with a bunch of fields
03:27 qiqo yeah marc inputs
03:27 chris where you can enter marc into eh?
03:27 qiqo tags etc..
03:27 yup
03:27 chris and you entered stuff in the isbn one?
03:27 and hit z3950 search?
03:27 qiqo errmm.. you are actually adding a new book then right?
03:28 i need the Z3950 feature...
03:28 chris no
03:28 thats how the z3950 works
03:29 there is a z3950 search button on that page
03:29 qiqo ahhh
03:29 chris that should open a popup and start searching servers
03:29 qiqo yeah
03:30 still no results found
03:30 chris if you are lucky you get some results that you can then click on the record, and it will populate your table
03:30 does it say ?? requests to go
03:30 and does the log say anything now
03:31 qiqo zconn.lib.monash.edu.au
03:31 ah sorry
03:31 Still ?? requests to go <--i get this
03:31 chris right
03:31 i get stuff like this in my log
03:32 @attr 1=4 "freakonomics" at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 279.
03:32 2007/86 : Processing title=freakonomics at MONASH zconn.lib.monash.edu.au:7090 voyager MARC21 (1 forks)
03:32 qiqo i only get blank logs
03:33 0B in size
03:33 chris hmm, well i gotta go sorry. Hopefully some of the other ppl can help
03:34 qiqo aww..
03:34 huhu :'(
03:34 chris 8.30 here, and i should spend some time with my wife .. before i get banned from the computer
03:34 :)
03:34 qiqo heheh
03:34 alright
03:34 thank you very much sir
03:35 these would help
03:35 chris no problem, good luck with it
03:40 qiqo hayz...
03:53 hmmm
04:03 ei i have a question, in 2.2.5 do we still need PDF::API2 v 0.3r77??
04:04 paul yep qiqo
04:04 qiqo or will any version of PDF::API2 do?
04:04 paul no
04:04 qiqo errm.. does perl still host the file?
04:05 ok ill start googling
04:05 hehe :)
04:12 hi ozmoze!!
04:12 oh hes gone
04:39 paul toins_: have you seen http://www.silicon.fr/articles[…]-ADSL-Orange.html
04:40 toins_ paul: I'm looking
04:40 ah yes... we lived through that outage!
04:41 paul that confirms it was indeed general and on the wanamou side
04:43 toins_ yep
07:14 paul toins_: are u around ?
07:15 toins_ yep
07:15 i'm here
08:03 paul hdl around ?
08:52 hello again, everybody
08:52 is kados around or still away ?
08:52 tumer hi paul
08:52 paul hi tumer
08:52 dewey i heard hi tumer was still struggling
08:53 tumer yes dewey and still
08:53 paul aren't you too disappointed by ID's answer (about merging data results)?
08:53 tumer very much
08:53 so we have to keep holdings data in biblios
08:54 my mistake was i missed this and finished writing a whole new API for it, ready to go
08:55 paul :-(
08:55 sorry for you
08:56 tumer i wish i knew some XML, XSL and XSLT
09:34 slef Hello all!
09:34 paul hi slef
09:35 slef Finally I stop doing VOIP and VPN and can get back on web sites more-or-less full-time.
09:36 Soon I will need to update my Koha installation to be useful again.  Should I aim for rel_2_6 or HEAD?
09:37 paul good question. depends on what you plan to do.
09:38 head and/or dev_week are definitely for w@rl0rd developers.
09:38 slef priorities: get a working koha, fix the installer
09:39 (as in fix the new installer)
09:39 paul so, rel_2_2
09:47 kados paul: I've replied to your email
09:47 hi all
09:47 paul kados, yes, i've seen & read the mail.
09:47 I had a question for you, about your recent commit :
09:47 kados paul: http://wiki.liblime.com/doku.php?id=koha226bugs
09:47 paul it's about addbiblio.pl, once again.
09:48 kados paul: don't know if you've seen this report or not
09:48 paul it don't work anymore, with NPL or default templates.
09:48 kados yep
09:48 addbiblio--
09:48 paul with default, I have an empty screen & with NPL, the authority report doesn't work anymore.
09:48 addbiblio-- ???
09:49 kados I'm very sorry
09:49 it's never worked correctly for MARC21 records
09:49 paul do you mean MARC21 or npl templates ?
09:49 kados but I seem to have broken your stuff in my attempt to fix
09:49 MARC21
09:50 paul imho, addbiblio is a proof that we must use only 1 set of templates in Koha.
09:50 kados yep
09:50 paul could you explain your problems ?
09:50 I could revert your commit & try to fix them myself
09:51 kados you can revert my latest commit
09:51 it was a mistake
09:51 I can show you the problems when you do revert it
09:52 paul OK, i'll revert immediately; give me 10 minutes to finish what i'm working on, and then I'll revert.
09:52 kados k, ping me when done
09:52 paul + look at my commits from today, I think i've solved some of the acquisition problems you reported on koha226bugs
09:52 (the receive one at least)
10:09 owen kados, you around?
10:12 kados owen: sure am
10:12 owen: glad to see you back :-)
10:12 owen I came in to find a note saying the internet was up and down all day yesterday, so who knows how long I'll be back
10:12 kados owen: just going through our todo list
10:13 owen: 100 is still in sync with dev_week
10:13 owen zoomopac?
10:13 dewey zoomopac is probably stock dev-week with only minor changes to searching soon to be committed
10:13 kados owen: yea
10:13 owen: just added isbn search
10:15 owen The "most recent additions" search doesn't seem to be working
10:15 kados hmmm ...
10:16 they might not all work
10:16 DVD does though
10:16 and by work, I mean that it relies on what's in the record
10:16 and in some cases, the date is listed as 2099-x-x
10:16 more preprocessing for me to do and it's on my list already
10:17 eg: http://zoomopac.liblime.com/cg[…]ail.pl?bib=139954
10:17 date acquired is: 202004-02-04
10:17 which is obviously more recent than anything with 2006- :-)
10:17 slef Got a strange problem with z39.50 search in a 2.2.5 system - everything is returning "Nothing found" even if the daemon finds it.  I'll read the debug log once I return in a bit, but any tips welcome.
10:18 kados slef: using Net::Z3950? or Net::Z3950::ZOOM?
10:18 slef owen: hi, by the way.
10:18 kados: whatever's default.  Net::Z3950?
10:19 kados slef: your best bet on that is upgrading to Net::Z3950::ZOOM and grabbing the latest code from rel_2_2
10:19 slef: troubleshooting the 2.2.5 version has been known to drive one mad
10:21 owen: the queries on all those are correct
10:21 owen: so if they don't work as expected, look to the record
10:21 well, it's a problem we can fix :-)
10:22 why call number sorting isn't working has got me stumped
10:23 owen: 'fix the "bold_title" in search results page'
10:23 owen: is that where the search query used to be highlighted?
10:23 owen Yeah, that was a long time ago.
10:24 (in dev_week terms)
10:24 kados owen: well, I can add it back, just tell me what you want the term to be wrapped in
10:24 owen: a span?
10:24 with a specific class?
10:24 <span class="term"></span> maybe?
10:25 owen Sure, why not
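A minimal sketch of the highlighting kados and owen are discussing; the helper name is made up and there is no HTML escaping or stemming here:

    #!/usr/bin/perl
    # Wrap each occurrence of the query words in <span class="term">...</span>.
    use strict;
    use warnings;

    sub highlight_terms {
        my ($text, @terms) = @_;
        for my $term (grep { length } @terms) {
            $text =~ s{(\Q$term\E)}{<span class="term">$1</span>}gi;
        }
        return $text;
    }

    print highlight_terms('Snow Crash / Neal Stephenson', 'stephenson'), "\n";
    # -> Snow Crash / Neal <span class="term">Stephenson</span>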
10:26 kados k ...
10:28 owen So kados bring me up to date on what you've been working on.
10:29 kados sure
10:29 mainly, over the weekend, I worked on the OPAC
10:30 my goal was to clean up all the OPAC scripts and remove all redundant/obsolete data
10:30 and address some of the usability feedback I've gotten from various places
10:31 owen Like what?
10:31 kados well, I've gotten numerous comments that all account-related stuff should be in one place
10:31 or at least in one area
10:32 several people didn't understand why the book bag link was above search results
10:32 they expected it to be in the 'account' section
10:33 several clients also didn't understand why there were so many of the same link on a given page
10:33 the 'Search Home' link on http://search.athenscounty.lib.oh.us/
10:34 owen Book bags aren't tied to accounts
10:34 kados is also called 'library catalog'
10:34 in the opacnav
10:34 which several people found confusing
10:34 I agree book bags aren't tied to accounts
10:35 but conceptually, that's where people seem to want them
10:35 at least based on the feedback I solicited
10:35 owen It's getting into shaky territory, because we can't lead people to believe that the book bag is saved with their account settings.
10:35 kados I cleaned up all the opac scripts to use the new API (hopefully didn't miss anything)
10:36 added a resident search to the masthead that should show up on all screens
10:36 modified how the searchdesc displays
10:37 'search' returned X results is always resident but not a large thematic element
10:37 added a rss feed icon
10:37 owen How does that work?
10:37 kados floated the re-sort list to the right to free up space
10:37 the rss relies on the OpenSearch plugin I haven't committed yet
10:38 so the feed is actually generated from a9.com (though we can generate feeds natively ... I just didn't get around to it)
10:38 lets see ...
10:38 opac-main has had a complete facelift
10:39 in preparation for a couple of features I've been working on
10:39 they are:
10:39 owen opac-main seems to be empty!
10:39 kados yep
10:39 1. 4-5 items related to items you've previously checked out
10:40 2. 4-5 items pulled from a given staff list (virtual shelf)
10:40 3. 4-5 items recently returned
10:40 so the idea was, make the main page more interactive
10:41 if they need to do a search, they can do it from any page
10:41 advanced search is always visible (in opacnav)
10:41 and then there's the new facets feature
10:42 owen Is that in an iframe?
10:42 kados (before I get to that ...
10:42 no iframe
10:42 owen Just overflow?
10:42 kados yea, it's overflow
10:43 the content is in javascript
10:43 yep, you'll get that
10:43 I turned on overflow: auto to enable that
10:43 because sometimes the list goes beyond the space designated
10:45 all of my design choices were made from looking at the following sites:
10:46 http://aqua.queenslibrary.org/
10:46 http://www.amazon.com
10:46 http://firstsearch.oclc.org
10:46 http://search.ebay.com
10:46 http://www.lib.ncsu.edu/
10:47 as well as the 'don't make me think' book
10:48 as far as facets go
10:49 paul kados, could you explain to me what "facets" means, please?
10:49 kados paul: if you do a search, the faceted results are those listed on the left-hand side under 'Subject' Authors' 'series'
10:50 paul: it is a compilation of all of a given aspect of the results
10:50 paul: such as subject, author, etc.
10:50 owen: so facets ...
10:51 will be completely re-written from a functional POV
10:51 probably to follow the FAST guidelines: http://www.oclc.org/research/projects/fast/
10:51 as far as display, I plan to only show the first 3-5
10:52 owen 3-5 what?
10:52 dewey -2
10:52 kados and hide the rest within a final one called 'see X more'
10:52 owen Thanks dewey!
10:52 kados 3-5 of each type of facet
10:52 so for instance, a search on neal stephenson
10:52 has:
10:52 Subjects
10:52 dewey Subjects are authority controlled
10:52 kados Scientists
10:52 Treasure trove
10:53 Kings and rulers
10:53 See 25 more
10:53 Series
10:53 The Baroque cycle
10:53 Volume two of The Baroque cycle
10:53 A Bantam Spectra book.
10:53 Authors
10:53 Stephenson, Neal
10:54 that way, the user won't have to scroll down to see that there are multiple types of facets
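A small sketch of the truncated facet display kados describes; the data structure and the cut-off of three values per facet are illustrative only:

    #!/usr/bin/perl
    # Show the first few values of each facet and collapse the rest behind
    # a "See N more" line.
    use strict;
    use warnings;

    my %facets = (
        Subjects => [ 'Scientists', 'Treasure troves', 'Kings and rulers',
                      'Cryptography', 'Science fiction' ],
        Series   => [ 'The Baroque cycle' ],
        Authors  => [ 'Stephenson, Neal' ],
    );
    my $show = 3;

    for my $type (sort keys %facets) {
        my @values = @{ $facets{$type} };
        my $last   = $#values < $show - 1 ? $#values : $show - 1;
        print "$type\n";
        print "  $_\n" for @values[0 .. $last];
        my $hidden = @values - $show;
        print "  See $hidden more\n" if $hidden > 0;
    }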
10:54 the final thing I did was the 'remove search' and 'further limit search to'
10:54 and the 'default facets'
10:55 default facets show up when there hasn't been a search
10:55 the others allow refining the current search to include additional characteristics
10:55 well ... 'Further limit' does
10:56 (and only works with CCL queries at the moment)
10:56 well ... that's enough to fill a book :-)
10:57 owen: comments, questions?
10:57 paul kados : addbiblio.pl reverted. http://i5.bureau.paulpoulain.c[…]mple/addbiblio.pl works (login test/test)
10:57 kados paul: did you change templates too?
10:57 paul: or are they stock cvs?
10:57 paul mmm... the default templates are OK
10:57 npl ones are not.
10:58 kados I suspect MARC21 doesn't work in default templates
10:58 I will test that first
10:58 using default
10:58 paul but I haven't got a MARC21 DB
10:58 (if you have a small one, throw me a link to DL it)
10:58 kados ok
10:59 owen How is 'remove search' supposed to work?
10:59 kados owen: you click on it and it takes you back to the advanced search page :-)
10:59 owen: it's pretty minimal at the moment
11:00 owen: but eventually I'd like it to be a search history
11:00 owen: I'm not 100% sure that the js hierarchy is the best way to do that
11:00 owen: but it was pretty quick for a prototype mechanism
11:02 paul kados : http://i8.bureau.paulpoulain.c[…]mple/addbiblio.pl is now UNIMARC with NPL templates (login test/test as well)
11:04 owen kados: my instinct is to say that the js hierarchy is too complex for our users
11:04 Particularly when the 'nodes' hardly ever contain more than one item each
11:04 kados owen: well that will change
11:07 owen How so?
11:09 kados well ... I can't explain the specifics because I haven't figured it out completely
11:09 owen :)
11:09 kados thd and I spent a good deal of the weekend pondering LC subjects
11:10 they present some pretty interesting puzzles :-)
11:10 owen: there were 5 original proposed ways to nest a given subject in a hierarchy:
11:10 http://kados.org/subject_hierarchy.html
11:10 and now we discovered OCLC's FAST project
11:11 so that's a 6th that I haven't taken the time to dissect yet
11:11 but one of those 6 will be used
11:11 thd kados: and Chan used a non-LC heading as an example
11:12 kados thd: right ...
11:12 thd kados: so I have erred in the absence of an authority file
11:13 kados thd: to err is human :-)
11:13 :-)
11:14 owen I like the 'narrow results by' options on the NCSU site
11:15 But I'm confused about what we're trying to do with facets...expand or narrow or both?
11:15 thd kados: there are still 5 examples; did you mean 5?
11:15 kados good question
11:15 thd: the sixth is FAST from OCLC
11:15 owen: good question
11:15 thd kados: what I am hoping we would do is change
11:15 kados: FAST does nothing for legacy records
11:16 kados thd: good point
11:16 owen: well, I think for the OPAC, expanding is probably the goal
11:16 owen: for the Intranet, I think staff will want narrowing
11:16 owen: so both :-)
11:16 thd kados: FAST is a system for how records might be subject coded when none of the cataloguers know how to subject code any longer
11:17 kados thd: that description fits NPL :-)
11:18 owen I like that the NCSU sidebar is a simple list, has limited results for each facet, and has a 'show more' link
11:18 thd kados: the problem with FAST is that all subject strings are very short so there is no way to specify the two narrow questions that a work may treat
11:19 kados owen: yep, we could do it that way
11:19 owen: the js was just for fast prototyping
11:19 owen I also like the way NCSU handles the narrowing process: you can delete the additional search terms by clicking an X in the results screen
11:19 kados owen: and I think they use FAST for display, or a subset of it
11:19 thd kados: when everything is top level you have no way to associate which strings belong together.
11:20 kados: FAST is subject soup.
11:20 kados owen: where is that?
11:21 thd kados: LCSH are a mess but a reasonably precise mess.
11:21 owen Above the search results:
11:21 "Search 'something': Early works to 1800 [remove] : England [remove] : Doctrines [remove]
11:21 We found 2 matching items. "
11:21 kados owen: I also like the subtle use of color for checked out vs available
11:21 owen: ahh, very nice, I see it now
11:21 owen: we could easily do that
11:24 owen I think the Queens Library search is way too complex
11:27 kados: How can I best be of help?
11:33 thd owen: what is too complex about a search that is too simplistic?
11:34 owen What do you mean thd?
11:34 thd owen: I would agree that it is not intuitive
11:35 owen: I think the progressive hierarchy of subject subdivisions implied by the Queens library search is mistaken
11:35 kados owen: paul has just fixed addbiblio in rel_2_2
11:36 owen: so one thing we need to do is sync default and npl templates
11:36 owen Okay.
11:36 thd owen: I think that the search should be faceted by place topic time and form
11:38 owen: I think that such subject elements from the same facet should be grouped together in the same facet for extending the search
11:39 owen: I suspect that kados will not go so far as adding the facility to browse for other terms to use in the same facet but I will in future
11:43 tumer hi all
11:44 kados: i am in pain with mike's answer
11:44 kados tumer: I can imagine :(
11:45 tumer so no more holdings records
11:45 kados tumer: even if we stored them in the same db with different indexing rules
11:45 tumer: I think two queries is too many :(
11:46 tumer i am reverting back to dev_week
11:46 paul the next question being: does it mean we forget about storing the issues information in zebra too?
11:46 tumer paul:no
11:46 we already do that
11:47 paul but it seems updating biblio + item for each issue/return is too much CPU consumming isn't it ?
11:47 s/seems/seemed/
11:47 kados it might be
11:47 plus in dev week, we do:
11:47 * store in items table
11:47 tumer that is another matter i am investigating
11:47 kados * store in zebra
11:47 so two operations with each circ, rather than just one
11:48 tumer waiting to see if it is windows issue only
11:48 paul store in items or issues table ?
11:48 kados tumer: waiting to see if what is windows issue only?
11:48 paul in 2.2, it's just in issues & calculated on the fly
11:48 kados paul: sorry, issues
11:48 paul ah, ok.
11:49 kados if we want to search by availability, we must store in zebra
11:49 and several clients want this feature
11:49 so the next question is: how can we speed up saving records to zebra?
11:49 paul by doing a commit only once every 10mn with shadow DB, iirc
11:49 kados one idea I had was to do a batch 'commit' operation once every 5 minutes or so
11:50 paul: you beat me :-)
11:50 paul so, no more commit in koha, but in crontab.
11:50 kados right ... one possible solution
11:50 paul sounds like an acceptable plan to me.
11:50 kados but it should be a syspref
11:50 IMO
11:50 because code already exists to commit with every operation
11:50 paul I agree, because for small libraries it won't be a problem
11:51 kados right
11:51 paul (or for large ones, with a few circ)
11:51 kados yep
11:51 and for libraries where accuracy is more important than speed
11:51 paul so I was right when I heard the systempref crying, 5mn ago ;-)
11:51 kados :-)
11:51 tumer i already added a batch commit syspref
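A rough sketch of the cron-driven commit paul, kados and tumer are settling on; the config path, script path and syspref name are made up for illustration, though zebraidx's commit action and C4::Context->preference are real:

    #!/usr/bin/perl
    # Flush Zebra's shadow registers periodically instead of committing on
    # every circulation operation.
    use strict;
    use warnings;

    my $zebra_config = '/usr/local/koha/etc/zebra-biblios.cfg';   # illustrative path

    # A syspref would decide between per-operation and batched commits;
    # 'BatchZebraCommit' is a made-up name.  In Koha it would come from
    # C4::Context->preference(...).
    my $batch_mode = 1;

    exit 0 unless $batch_mode;

    # 'commit' merges the shadow files into the live registers.
    system('zebraidx', '-c', $zebra_config, 'commit') == 0
        or die "zebraidx commit failed: $?";

    # crontab entry, every 10 minutes as paul suggests:
    #   */10 * * * *  /usr/local/koha/bin/zebra_commit.pl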
11:52 paul tumer is faster than lucky luke
11:52 (if you know this cartoon)
11:52 tumer yep
11:52 but my pistol is aching today
11:53 thd owen, kados: see Norgad et al., The Online catalog: from technical services to access service. In Advances in librarianship, v. 17.
11:54 the title does not do justice to the content
11:54 kados heh
11:56 thd: I will probably not have time to work on facets today
11:56 thd: maybe later this evening or tomorrow
11:56 thd kados: well later this evening or tomorrow would be good
11:56 kados thd: :-)
11:59 paul bye bye everybody, see you on thursday
