IRC log for #koha, 2007-03-03

All times shown according to UTC.

Time Nick Message
12:58 thd hdl: are you present?
12:58 hdl my internet connection is
12:59 ... fuzzing.
12:59 I may quit unexpectedly.
12:59 But yes.
13:00 thd hdl: does FRANTIQ have a Z39.50 server?
13:00 hdl Not yet.
13:00 With rel30, maybe.
13:01 thd hdl: are you planning to move them to rel_30 ?
13:01 hdl Yes, but only when it is rock stable
13:02 thd hdl: do you know of any UNIMARC Z39.50 authority record targets?  I need to test my application but I have no UNIMARC targets to use for testing.
16:12 kados: are you there?
16:12 kados thd: I'm at code4lib conference
16:12 thd kados: what subject?
16:13 kados thd: library-in-a-box just finished up
16:14 thd kados: so if you are finished, do you have a moment?
16:18 kados: sorry, I forgot that meant you are away at the conference event
18:50 kados?
18:50 dewey rumour has it kados is helping us LibLime folk with something at the moment
18:50 thd dewey: how old is your rumour?
18:50 dewey thd: bugger all, i dunno
18:51 thd quit
19:02 Lea evening/morning. I guessed more people would be on at this time of night/day :D
19:03 Is there a "standard" for Subjects? Or should I just let the librarians use anything?
19:03 chris there are standards
19:03 The Library of Congress has some, but they will all have the wrong spellings for you :)
19:04 I'm sure the national libraries in the UK must have a set of subject headings too?
19:04 thd although LC manaiNa
19:04 Lea ok, I'll tell you my situation: I'm migrating data that has NO subject data. I can do mappings based on existing keywords, but I would like a list that the librarians can choose from in ambiguous cases. So I'd like a standard list of Subjects.
19:05 The UK national library wants to charge for MARC21 data! I was shocked!
19:05 chris oh yeah, the NZ national library does it too
19:05 in fact most of them do
19:05 Lea it's really hard to find MARC21 Z39.50 servers in the UK
19:05 chris yep
19:05 the universities might be your best bet
19:05 Lea everyone is SUTRS mad
19:06 I was hoping to throw all my ISBNs at a bunch of servers and problem solved...
19:06 chris right
19:06 thd Lea: LCSH has been adopted as a standard at the British Library since they abandoned PRECIS and COMPASS
19:07 chris American spellings and all, thd? Or do they have an LCSH version with English English? :-)
19:07 Lea talking of which, I got some nice code linking up pyMARC and pyZ3950 which can pull in all the MARC21 data for an ISBN, and you can access the tags like record["245"]["a"]
19:07 thd Lea: Oxford has a public Z39.50 server
19:07 Lea oxford?
19:07 I'll have a search
19:08 thd Lea: Oxford University I meant
19:08 Lea I have this problem with the barcodes in that some servers have the same books listed under different barcodes. I'm not a librarian, so I was totally foxed by this problem
19:09 so I'm searching for a 13-digit barcode and getting nothing. Search by title and I get the book under a different barcode.
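One common reason for that kind of mismatch (an assumption here, not something stated in the log) is that one catalogue records the 10-digit ISBN while another records the 13-digit form of the same number. A minimal Python sketch for deriving the ISBN-10 from a 978-prefixed ISBN-13, so that both forms can be tried against a target:

    def isbn13_to_isbn10(isbn13):
        """Derive the ISBN-10 form of a 978-prefixed ISBN-13, or return None."""
        digits = [c for c in isbn13 if c.isdigit()]
        if len(digits) != 13 or digits[:3] != ['9', '7', '8']:
            return None  # 979-prefixed ISBN-13s have no ISBN-10 equivalent
        core = digits[3:12]  # drop the 978 prefix and the ISBN-13 check digit
        check = sum((i + 1) * int(d) for i, d in enumerate(core)) % 11
        return ''.join(core) + ('X' if check == 10 else str(check))

    # e.g. isbn13_to_isbn10('9781904570011') -> '1904570011'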
19:09 thd Lea: I can give you the configuration data for Oxford University's server
19:14 Lea I've done it. Thanks
19:14 again, searching on title gave me a hit
19:15 searching on isbn didn't
19:17 thanks for the oxford uni tip. I think that might do it.
19:17 thd Lea: Do you have this library.ox.ac.uk:210/ADVANCE ?
19:17 Lea yeah
19:17 thanks
19:19 Here's my idea: 1. Load in the existing MARC21 data (a bit rubbish). 2. Search Oxford by ISBN; if that works, import the data and overwrite the fields in the existing records. 3. If the ISBN search fails, search by title, return a list of records, and store them in the database; the librarians can then use a simple web UI to select the correct book when a title gets multiple hits. Sound reasonable?
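A rough sketch of that workflow in Python, assuming a hypothetical z3950_search() helper wrapping a PyZ3950 query and a save_candidates() hook feeding the librarians' review UI (both names are illustrative, not from the log):

    def upgrade_record(existing, z3950_search, save_candidates):
        """Try an ISBN match first, fall back to title, else queue for review."""
        isbn = existing.get('isbn')
        title = existing.get('title')

        if isbn:
            hits = z3950_search('isbn=%s' % isbn)
            if len(hits) == 1:
                return hits[0]               # unambiguous ISBN match: overwrite fields

        hits = z3950_search('ti="%s"' % title) if title else []
        if len(hits) == 1:
            return hits[0]                   # single title hit: still safe to take
        if hits:
            save_candidates(existing, hits)  # several hits: let the librarians choose
        return None                          # nothing found: leave the record alone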
19:19 thd Lea: I would very much like to know the current Z39.50 connection information for Cambridge University libraries.  My information is incorrect for them.
19:20 Lea I used the same info as you did
19:20 conn = zoom.Connection ('library.ox.ac.uk',210)
19:20 conn.databaseName = 'ADVANCE'
19:20 query = zoom.Query ('CCL', 'isbn=9781904570011')
19:20 res = conn.search (query)
19:20 that worked for me
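Fleshed out with imports and the MARC parsing needed to reach the record["245"]["a"] access mentioned earlier, the pasted snippet might look roughly like this; the preferredRecordSyntax setting and the pymarc parsing step are assumptions, not something shown in the log:

    from PyZ3950 import zoom
    from pymarc import Record

    conn = zoom.Connection('library.ox.ac.uk', 210)
    conn.databaseName = 'ADVANCE'
    conn.preferredRecordSyntax = 'USMARC'   # ask for MARC21 rather than SUTRS

    query = zoom.Query('CCL', 'isbn=9781904570011')
    res = conn.search(query)

    if len(res) > 0:
        record = Record(data=res[0].data)   # parse the raw MARC21 transmission format
        print(record['245']['a'])           # title proper, as described above
    conn.close()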
19:21 what are you using to access it?
19:21 thd Lea: yes Oxford works but my information for Cambridge is incorrect.
19:21 Lea ah, misread. It's late here :)
19:21 thd Lea: matching on ISBN alone is very unreliable
19:22 Lea: I have created a special application for doing what you intend
19:23 Lea ??
19:23 I've had 2 direct hits on title now
19:26 thd: what is this application you speak of?
19:30 thd Lea: you need to search on multiple fields simultaneously for a reliable match
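One way to do that with the same PyZ3950 connection is a combined CCL query that constrains both the ISBN and a title word taken from the existing record; whether a particular target's CCL profile accepts ti= and boolean and varies by server, so this is only an illustration:

    # Values would come from the existing record being upgraded; the title
    # word shown is purely illustrative.
    isbn = '9781904570011'
    title_word = 'history'
    query = zoom.Query('CCL', 'isbn=%s and ti=%s' % (isbn, title_word))
    res = conn.search(query)   # reuses the conn object from the earlier snippet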
19:30 Lea: how many records do you need to upgrade?
19:31 Lea let me think
19:31 I think it's around 16,000
19:31 no, must be less
19:31 ... I'll say that to be on the safe side
19:32 thd Lea: I have an application for doing that automatically from the data in existing records.
19:33 Lea wow, this gets better by the minute :D
19:35 thd Lea: my application works well but has not yet been ported to Perl and Koha, because the new Perl Z39.50 module had not been written when I first started
19:37 Lea: I matched about 80% of the records in a small Native American library using the application in the spring
19:39 Lea: Most of the other 20% were specific reports on that tribe and just impossible to find in another collection
19:42 Lea ah ok
19:42 you were looking for American books?
19:43 thd Lea: I have almost all known public UK Z39.50 servers in the client to which I gave you the link
19:44 Lea: at the bottom of the form is a list of targets
19:44 Lea superb. I have asked you a Q in PM
19:46 thd Lea: University of London usually rejects the connection, and the British Library requires a password for MARC records
19:48 Lea: the rest should work fine except for Cambridge University as I already stated
19:50 Lea http://www.hero.ac.uk/uk/refer[…]cambridge4579.cfm
19:52 wow, I really have to go to bed. NN all. Thanks, thd. I think that will be a great asset
21:21 thd chris: do you know who Lea is?
21:21 chris someone from Wales :)
21:23 thd chris: Lea wanted me to send something via email but I kept losing the connection from my friend's apartment and I missed her request until she had logged off #koha.
21:24 chris right
21:24 tnb chris: hey :)
21:24 chris heya tina
21:24 tnb how's it going?
21:24 chris thd: dunno her email address, I'm afraid
21:24 thd thanks chris
21:24 chris tnb: good thanks, Friday afternoon... how's things there?
21:25 tnb pretty good, I'm creating a profile of 'Koha users'
21:25 so that is quite a task ;)
21:25 chris I bet
21:25 tnb I heard a rumor that the Toledo diocese was going live today; did you hear that rumor?
21:25 :)
21:26 thd tnb: do you mean librarians or patrons?
21:26 tnb thd: libraries :)
21:26 chris well I uploaded their circ data today, so I think they did, tnb
21:26 tnb cool.  
21:27 thd: I've been sending out some emails to 'fill in the blanks' and get some of the information that isn't on the Koha users wiki
21:27 stats, etc.
21:28 there's still a lot of work to do on that front
21:28 thd tnb: Do you know if kados is accessible via cellphone at the code4lib conference?
21:31 tnb thd: he has a lot of meetings this evening, so he might have it off
21:31 you could try though
21:32 thd tnb: I assume that he is attending the conference and that there is no exhibition space. Is that correct?
21:41 tnb thd: yes, that is correct
21:42 rch :q
21:42 wrong window :) :wq
21:44 thd tnb: I wanted to ask him to collect some information from one of the presentations tomorrow
21:49 tnb thd: probably an email would work.  I'm sure he'll check it several times before tomorrow ;)
21:53 thd tnb: thanks, I thought he might ignore email at the conference. However, I should not assume that my habits are his.
