IRC log for #koha, 2006-09-07


All times shown according to UTC.

Time Nick Message
12:07 kados thd: you around?
12:08 thd: I seem to recall you had the low down on how to get cheap bibliographic and authority records ...
12:26 hdl tumer around ?
12:27 seems not.
12:43 thd kados: I am here
12:43 kados thd: do you see my question above?
12:44 I'm interested in pursuing our discussions about obtaining cheap bib and auth records
12:45 thd kados: yes. Are you asking about records matched to the records in an individual library, or a large database to serve the needs of many libraries?
12:46 kados I'm looking for a cheap way to obtain all the MARC records ever created :-)
12:47 thd ok I will supply them all in five minutes
12:56 kados thd: list?
12:57 thd kados: how much of "all" do you want, in terms of countries and libraries where the records were created?
12:58 and languages?
13:06 kados I want everything :-)
13:06 well ... options are always good
13:06 ultimately I want everything
13:06 but it would be good to start with a subset
13:06 say, US records only
13:06 to start with
13:07 thd kados: English only?
13:08 kados: LC only?
13:08 kados not LC only
13:08 not English only
13:10 thd kados: targeting selection of particular records from other libraries is OK but wholesale acquisition of records from a database could get you in trouble.
13:11 kados: however, people are willing to sell databases of records
13:11 kados: or rather license their use for approved purposes
13:12 kados right
13:12 thd kados: what do you intend to do with the records?
13:12 kados well I'd be looking for someone willing to sell access to the entire set of data
13:12 so that I can set up http://kohacat.org
13:13 which will allow people to search for and download records using a web-based interface as well as Z39.50
13:13 thd kados: which would be open to any library, or just Koha users, LibLime clients, or what?
13:18 kados: are you still there?
13:18 kados yes
13:18 it would be open to the world
13:21 thd kados: would there be a fee associated with access or downloading records?
13:21 kados no
13:21 it would be donation based
13:22 thd kados: libraries would have to donate a record for every one they copied?
13:22 kados hehe ... no
13:22 not even
13:22 there would be no obligations
13:24 thd kados: you might be able to make a deal with some sources for databases which they guard carefully, but for a service that is free and open to the world for copying, you would not be able to get the best records without replacing their current revenue stream from those records.
13:24 s/best/some very good database sets/
13:25 kados so the only solution is to purchase directly from LC?
13:25 thd kados: no
13:25 kados what then?
13:26 thd kados: however, you would not be able to get many records which were not in LC from aggregators with that model
13:27 kados: you could get LC records with some other modest databases from a third party cheaply
13:28 kados how cheaply?
13:28 thd kados: to complement that set in the US you would need to make deals with individual libraries
13:29 kados I currently have about 10 million records
13:40 dce_wi hello.  Does anyone know the correct way to encrypt passwords for koha?  I did a bulk import of patrons but clearly don't have the passwords encrypted right yet.
13:49 owen Isn't sure... are koha passwords encrypted with MD5()?
13:53 dce_wi They don't look like standard MD5 crypts (non-standard characters, too short and mixed case.)
13:54 owen I'm not much at Perl, but it looks like the scripts use a special function to encrypt the passwords before they go into MySQL: md5_base64() ?
14:01 dce_wi Well, that certainly returns text that looks like what is stored in the database.
14:04 Bingo.  After updating a student's password using that I can log in as them.  Thanks!
14:04 owen glad I could help
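The function owen names above, md5_base64 from Digest::MD5, is a minimal way to reproduce that hashing step in a bulk-import script; the password value below is illustrative, and this is only a sketch, not Koha's own loader.

    #!/usr/bin/perl
    # Sketch: hash a patron password the way the chat above describes,
    # so the value stored in the borrowers table matches what Koha expects.
    use strict;
    use warnings;
    use Digest::MD5 qw(md5_base64);

    my $plain  = 'secret';                 # illustrative value
    my $hashed = md5_base64($plain);       # short, mixed-case, base64-style string, as described above
    print "$hashed\n";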
14:18 dce_wi \quit
14:18 lol
14:18 l8er
15:43 chris morning
15:53 kados hey chris
15:54 chris heya .. i have a question for ya
15:55 with dev_week ... if i had a bunch of marc records .. and had set up dev_week wrt the page on the wiki
15:55 would i be able to bulkmarcimport them in?
15:56 or do i need to bulkmarcimport them then load them into zebra .. or even bulkmarcimport them into 2.2.x copy the db to my dev_week, run the convert scripts then load the marc files into zebra ?
15:56 ok thats more than one question :)
16:05 and when youve digested that, theres finally some interesting discussion on ngc4lib :)
16:12 ohh and evergreen is live today eh
16:45 kados yes, bulkmarcimport should work
16:46 it should import into koha tables, sql and index in zebra
16:46 chris cool thanks
16:46 kados but you won't be able to reindex the set without exporting first
16:46 and yea, I've been following the ngc4lib bit
16:47 chris ahh right
16:47 second question then ... can i get a bunch of marc records from you :-)
16:47 if you have a library that doesnt mind us using them for testing
16:47 kados should have them already
16:48 lemme check for the link
16:48 chris ahh did bob email and ask for some?
16:48 kados yea
16:48 chris ahh sweet as, ignore me then
16:49 kados http://liblime.com/public/192records.utf8.mrc
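A quick way to sanity-check a file like the one linked above before handing it to bulkmarcimport is to read it with MARC::Batch. This is only a sketch, assuming MARC::Record/MARC::Batch are installed; the filename simply matches the URL above.

    #!/usr/bin/perl
    # Sketch: count and list the records in a USMARC file before importing.
    use strict;
    use warnings;
    use MARC::Batch;

    my $batch = MARC::Batch->new('USMARC', '192records.utf8.mrc');   # sample file linked above
    my $count = 0;
    while (my $record = $batch->next) {
        $count++;
        print $record->title(), "\n";      # 245 title, just to eyeball the data
    }
    print "$count records read\n";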
17:29 hdl chris : what is ngc4lib ?
17:30 chris hi hdl
17:30 hdl hi chris.
17:30 chris its next gen catalogue for libraries, mailing list
17:30 mostly its people complaining about stuff
17:31 but sometimes it has good discussions
17:31 ill find the link
17:31 hdl seen crayon has much chance to be a girl GOOOD.
17:31 dewey I haven't seen 'crayon', hdl
17:31 chris http://dewey.library.nd.edu/ma[…]ng-lists/ngc4lib/
17:32 yep, we'll find out in a few weeks for sure
20:36 speedyfs Hi
20:37 chris hi
20:37 dewey hello, chris
20:37 speedyfs my librarian seems to be having a problem adding books to koha
20:38 chris in what way?
20:38 speedyfs if i go to
20:39 add marc record with framework and type in the isbn
20:39 it says Z3950 search results nothing found
20:40 still ?? requests to go
20:40 im not sure why that is
20:40 chris have you started the z3950 daemon running?
20:41 speedyfs how would i start it
20:42 chris i guess not then :)
20:42 speedyfs lol
20:42 chris 2 secs ill find the mail that explains
20:42 it
20:43 cd /usr/local/koha/intranet/scripts/z3950daemon  (if thats where its installed on your system)
20:43 ./z3950-daemon-launch.sh
20:43 probably need sudo, or to be root to do that
20:45 speedyfs ./z... starts it
20:45 chris cool
20:45 speedyfs sorry that was a ?
20:45 doing that will start it?
20:45 chris yes
20:45 http://lists.katipo.co.nz/publ[…]/2006/009876.html  
20:45 is probably helpful too
20:45 speedyfs cool
20:46 do i have to restart anything?
20:46 chris not if you had set all your z3950 things up in the admin interface already
20:48 speedyfs hmm....
20:48 i have the library of congress server in the servers list
20:48 i guess somethings not set up right
20:49 chris whats the ranking on it?
20:49 Did you leave the ranking on the list as zero? If so, that server will not be searched.
20:49 so maybe change it to 1
20:50 and restart the z3950 daemon
20:50 you can check if its running ok
20:50 by looking in /usr/local/koha/log
20:51 there will be a file there z3950-daemon-something-something.log
20:53 speedyfs cant locate Net/Z3950.pm
20:53 chris there we go
20:53 you'll need to install that to get z3950 working
20:54 you can get it from cpan
20:54 speedyfs i installed it
20:55 it says it was aborted on line 11 of processz3950queue
20:55 chris yep thats the use Net::Z3950; line
20:56 perl -MNet::Z3950 -v
20:56 what does that tell you
20:57 hmm actually ignore that
20:58 perl -MNet::Z3950 -e 'print "hello\";'
20:58 what happens if you run that at the commandline
20:59 speedyfs same thing
21:00 as before
21:00 chris right so its not installed
21:00 speedyfs cannot locate
21:00 chris not anywhere that perl can find it anyway
21:02 speedyfs ok
21:11 i tried reinstalling it
21:12 but it doesnt seem to work
21:12 chris i seem to remember it was tricky to install
21:14 speedyfs it says Net::Z3950 is up to date now
21:14 chris and  perl -MNet::Z3950 -e 'print "hello\";' works?
21:15 speedyfs no
21:17 same error
21:17 chris well where the heck did it install it then i wonder?
21:18 perl -MCPAN -e 'install Net::Z3950'
21:18 is that how you installed it?
21:19 speedyfs it said it installed to /usr/local/lib/perl/5.8.7/Net
21:19 chris right
21:19 what does perl -v tell you
21:20 v5.8.7 ?
21:20 speedyfs yes
21:20 chris well i have no idea then
21:20 speedyfs ::sigh::
21:20 lol
21:20 chris seems something weird with your perl
21:22 perl -MNet::Z3950 -e 'print "hello\n";'   (it should be, i missed the n) but i bet that will give the same error
21:22 on my machine i get
21:22 chris@orbweb:/var/log/apache2$  perl -MNet::Z3950 -e 'print "hello\n";'
21:22 hello
21:22 dewey niihau, chris
21:24 speedyfs it says hello
21:25 it worked
21:25 chris cool
21:25 try starting your z3950 daemon again
21:25 speedyfs ok
21:26 when i start it, is anything supposed to come up
21:27 chris no
21:28 speedyfs UR THE MAN YES!!!
21:28 it works
21:28 now
21:28 chris cool
21:28 you might want to set that up as an init.d script, so the z3950 daemon is started if the server is rebooted or anything
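As a slightly fuller check than the one-liner above, the classic Net::Z3950 API can open a connection and run a search. This is a sketch only: the target host, port, database name and query below are illustrative assumptions, not values from the chat.

    #!/usr/bin/perl
    # Sketch: confirm Net::Z3950 loads and can reach a Z39.50 target.
    use strict;
    use warnings;
    use Net::Z3950;

    print "Net::Z3950 $Net::Z3950::VERSION loaded\n";

    my $conn = Net::Z3950::Connection->new(
        'z3950.loc.gov', 7090,             # illustrative target
        databaseName => 'Voyager',         # illustrative database
        querytype    => 'prefix',
    ) or die "connect failed\n";

    my $rs = $conn->search('@attr 1=7 0596000278')   # ISBN search, illustrative value
        or die "search failed\n";
    print "hits: ", $rs->size(), "\n";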
21:30 speedyfs ok now it checks 3 or 4 times then finds the books then it goes to 404 not found
21:31 chris yeah thats a template error
21:31 it works with the default templates, there is a bug in the npl ones
21:31 its been fixed, for the next version
21:31 lemme find you the fix
21:35 hmm cant find it
21:35 can u just try changing to the default templates and check it works there
21:36 speedyfs so if i put it on its default it'll work
21:36 ok
21:37 chris it should yes
21:38 ahh found the fix
21:38 http://cvs.savannah.nongnu.org[…].2.2.2&r1=1.2.2.1
21:38 specifically
21:38 <!-- TMPL_IF NAME="refresh" -->
21:39 and the matching <!-- /TMPL_IF -->
21:39 need to be added to the template
21:39 or you can use the default ones
21:39 hi tumer
21:39 dewey hi tumer is still strugling
21:39 tumer hi chris
21:40 chris hows sunny cyprus?
21:40 tumer very hor
21:40 hot
21:40 speedyfs ok the tmpl default works like you said
21:41 tumer whats new? i have been away for some time
21:44 chris hmm not too much, ive been pretty busy, we just had another library go live on last friday
21:44 so ive been tidying up all the little bits and pieces we missed :)
21:45 how bout with you? busy as always im guessing
21:45 tumer well i better read the logs. Done some holidays and almost finished rel_3.2
21:45 chris woot
21:45 excellent
21:46 tumer i have to do the acquisitions bit as well though
21:47 chris ahh right
21:47 tumer no more utf8 problems, no more MARC, all XML
21:47 chris that sounds so much better
21:48 tumer i have 80 library directors coming next week have to show them KOHA
21:49 chris ohh excellent, i hope that goes well
21:49 we have the library conference of nz coming up in august
21:49 sorry october
21:50 tumer by then you probably will be able to show them all the goodies
21:50 chris excellent that would be fantastic, now the library has gone live, ill have some time to do some testing/bugfixing etc if you need that
21:51 tumer yeah great after next week
21:51 chris cool
21:51 tumer nice talking to you i have to sleep
21:52 chris sleep well, cya later
21:52 tumer gnight
21:54 speedyfs thanks chris you fixed that too
21:54 chris cool
21:55 speedyfs yeah im working at a new school and we're using linux and all open source software
21:55 so its kinda hard learning everything
21:56 i appreciate everyones help
21:56 chris no problem
21:57 speedyfs i hope my librarian has no more problems with it
21:57 thanx gnite
21:57 chris night
02:02 osmoze hello
02:21 btoumi hi all
02:30 toins hi all
02:35 hdl tumer there ?
02:36 good sleep tumer.
02:37 the warrior rest
09:40 tumer hdl: around?
09:40 paul hi tumer.
09:40 dewey hi tumer is still strugling
09:40 paul hdl should not be too far...
09:40 tumer hi paul
09:40 hdl seems to be looking for me, whats up?
09:41 hdl hi
09:41 dewey salut, hdl
09:41 paul I think it was about the koha-devel question
09:41 tumer hi
09:41 dewey privet, tumer
09:41 tumer sergeant dewey
09:41 hdl tumer : it was about koha-devel question and also about utf-8.
09:42 tumer shoot
09:42 hdl UTF-8 first :
09:42 is HTML::Template::Pro enough for utf-8 management?
09:43 It seems to me yes, for what I could judge.
09:43 tumer necessary, but we have to use Encode as well
09:43 hdl When ?
09:43 tumer well for HEAD we have to but rel_2 should be ok with it
09:44 hdl Is it for xml processing without losing utf-8 encodings ?
09:44 tumer yes for that i have to use Encode
09:44 hdl Is it for DBI utf-8 management ?
09:45 tumer all xml management DB or Zebra
09:45 hdl and the use utf8; pragma does not help in any way ?
09:46 tumer HTML::Template::Pro solves our CGI problems
09:46 use utf8 is useless
09:46 hdl Have you tried #!/usr/bin/perl -COE  pragma in your 1st line ?
09:46 tumer we actually have to set the flags ourselves
09:47 hdl Why this ?
09:47 tumer hdl: i think i solved this UTF-8 problem, and adding a couple of encode or decode calls did no harm to the new code
09:48 hdl OK. code committed ?
09:48 tumer hdl: you remember the MARC problem we discussed?
09:49 hdl tumer: which one ? It has been quite a while since we last spoke.
09:49 tumer i was saying MARC was buggy. But it's not; all our packages for XML do the same thing
09:49 hdl YEAh.
09:49 MARC::Charset seemed to double encode.
09:49 tumer so now i fully follow the strict guidelines
09:50 MARC is functioning correctly, Charset or XML whatever
09:50 its the other packages, like HTML::Template, that were causing the problem
09:50 hdl About your version for HEAD. Are you using zebra version 2.0 ?
09:51 tumer any non UTF8 aware package will create the same problem
09:51 i am using Zebra 2.0 but it works with 1.4 as well
09:52 hdl tumer : yes : that was my conclusion at least for CGI and DBI about.
09:52 tumer The double encoding is due to half our data having flags set (from MARC) and the rest not
09:52 Pro solves that problem
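What tumer describes above, half the data carrying Perl's internal utf8 flag and half not, can be made concrete with Encode: decode raw octets once on the way in, encode once on the way out. This is a sketch with illustrative data, not a patch from the code being discussed.

    #!/usr/bin/perl
    # Sketch: decode UTF-8 octets on input, encode on output, so flagged and
    # unflagged strings never get mixed (the source of the double encoding).
    use strict;
    use warnings;
    use Encode qw(decode encode);

    my $bytes = "Caf\xc3\xa9";                       # illustrative UTF-8 octets, e.g. straight from DBI or Zebra
    my $chars = decode('UTF-8', $bytes);             # character string, utf8 flag set
    printf "flag: %s, chars: %d\n",
        utf8::is_utf8($chars) ? 'on' : 'off', length $chars;   # prints: flag: on, chars: 4

    my $out = encode('UTF-8', $chars);               # back to octets only at output time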
09:53 hdl It seems that zebra2.0 introduced a new xml configuration standard.
09:53 Is not that what you used ?
09:53 tumer yes now batch xml indexing is possible
09:54 by the way i have installed HEAD to our library and have been debugging it last 2 weeks
09:54 ALL XML
09:54 except breeding farm
09:54 hdl I mean, indeed, it seems to me that the record.abs syntax is quite cryptic.
09:55 tumer have you seen the new indexing file i committed?
09:55 koha2index.xml
09:55 hdl yes.
09:55 tumer more manageable
09:55 hdl But I thought it worked only with zebra 2.0
09:55 tumer works with 1.4 but now i upgraded to 2
09:56 hdl COOL.
09:56 tumer 1.4 was never officially released anyway
09:56 hdl I have tried to make a dtd for frameworks so that they could be in xml files.
09:57 But still under nightly construction so not too quick.
09:57 tumer well i tried an xml stylesheet for results and it works nicely
09:58 but no time for that now later after i finish debugging
09:59 once i commit i would like you to merge your Authority stuff to HEAD; as i did not understand it, i could not write it with XML
10:00 hdl Ok.
10:00 I will have to understand your code.
10:00 tumer i will be around
10:01 Currently trying to fix a bug with add biblio and marchtml2xml
10:02 when you add more than one field marchtml2xml puts them all in one field
10:02 I added a new index to addbiblio.pl and solved the problem of adding multiple authorities
10:03 they will have to be  implemented in rel_2 as well
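The behaviour being fixed above, repeated tags from the editor collapsing into a single field, can be illustrated with MARC::Record: each repeat should become its own field object. The tag and subject values below are illustrative, and this is only a sketch of the intended result, not the addbiblio.pl fix itself.

    #!/usr/bin/perl
    # Sketch: two subject entries should yield two separate 650 fields, not one.
    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new();
    $record->append_fields(
        MARC::Field->new('650', ' ', '0', a => 'Cataloging'),              # illustrative heading
        MARC::Field->new('650', ' ', '0', a => 'Open source software'),    # illustrative heading
    );

    my @subjects = $record->field('650');
    printf "650 fields: %d\n", scalar @subjects;     # expect 2, not 1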
