All times shown in UTC.
Time | Nick | Message |
---|---|---|
13:04 | kyle | hey kados, you around? |
22:00 | kados | thd: are you present? |
22:01 | thd: I received a question from someone recently regarding a good source for German-language materials | |
22:01 | the German national library only allows SUTRS access to their records :( | |
22:02 | thd | yes Germany is mostly mean about MARC records |
22:02 | kados | thd: is there a way to obtain the records? :-) |
22:03 | thd | kados: there is fortunately a good union catalogue |
22:12 | kados | thd: which one? |
22:12 | dewey | hmmm... which one is that? :-) |
22:13 | thd | GBV |
22:14 | kados | thd: got z39.50 source for that? |
22:14 | thd | Bremen, Hamburg, Mecklenburg-Vorpommern, Niedersachsen, Sachsen-Anhalt, Schleswig-Holstein, Thüringen and the Stiftung Preußischer Kulturbesitz |
22:15 | z3950.gbv.de:20011/GVK|yes|MARC 21|MARC 21|MARC 21|Gemeinsamen Bibliotheksverbund (GBV) | |
22:15 | kados | woot |
22:15 | thd | you need a password |
22:15 | kados | pay for it? |
22:15 | http://www.gbv.de/vgm/info/ben[…]tras_0065?lang=en | |
22:15 | found this site | |
22:16 | thd | $connectArray = array("user"=>"999", "password" => "abc"); |
22:18 | kados: they restrict table of contents access and most databases to network members | |
22:19 | kados | thd: db name? |
22:19 | thd | port 20011 DB GVK |
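The connection details scattered through the conversation above (host `z3950.gbv.de`, port 20011 with the port selecting the record encoding, database `GVK`, and the `"999"`/`"abc"` credentials quoted earlier) can be collected into one place. A minimal sketch, using only core Perl; the live ZOOM call is shown only as a comment since it needs the `Net::Z3950::ZOOM` bindings and network access, and the credentials are the placeholders from the log, not working ones:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Connection details for the GBV union catalogue as given in the log:
# host z3950.gbv.de, port 20011 (the port determines the record encoding),
# database GVK, MARC 21 records.
sub gbv_connection_options {
    return {
        host                  => 'z3950.gbv.de',
        port                  => 20011,
        databaseName          => 'GVK',
        user                  => '999',      # placeholder from the log
        password              => 'abc',      # placeholder from the log
        preferredRecordSyntax => 'usmarc',   # MARC 21
    };
}

my $opt    = gbv_connection_options();
my $target = "$opt->{host}:$opt->{port}/$opt->{databaseName}";
print "would connect to $target\n";

# With the ZOOM bindings installed (as on a Koha box) the actual call would
# be along the lines of:
#   my $conn = ZOOM::Connection->new($opt->{host}, $opt->{port},
#       databaseName => $opt->{databaseName},
#       user => $opt->{user}, password => $opt->{password});
```

As thd notes below, connecting to a different port on the same host yields records in ISO-5426 rather than MARC-8, so the port is effectively part of the configuration.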
22:20 | kados | thd: cool, it seems to even work from koha :-) |
22:20 | thd: I added it to koha.liblime.com | |
22:21 | but they use a weird encoding | |
22:21 | thd | kados: there is another very large network but that is for brief UNIMARC records only and full UNIMARC records if you are a member |
22:22 | kados | thd: their encoding is problematic |
22:22 | thd | kados: did you use the correct port |
22:22 | kados | thd: they don't use utf-8 or marc-8 |
22:22 | thd | port determines encoding |
22:23 | kados: if you have the wrong port they will be in ISO-5426 | |
22:24 | kados | thd: I used 20011 |
22:24 | thd: and the encoding isn't marc8 or utf8 | |
22:24 | :( | |
22:24 | encoding-- | |
22:27 | thd | still have not found ports but supported attributes are http://www.gbv.de/en/services/[…]=2.1@kavia.gbv.de |
22:30 | ports and encoding http://www.gbv.de/vgm/info/ben[…]tras_0065?lang=en | |
22:31 | kados: Ansel is MARC-8 | |
22:31 | kados | hmmm |
22:32 | so since I'm using that port | |
22:32 | it must be that Koha isn't importing properly :( | |
22:33 | encoding-- | |
22:33 | thd | I was going to suggest Koha as the problem |
22:33 | kados | hehe |
22:33 | thd | kados: I have tested this server thoroughly with my own client |
22:34 | kados | ok ... so it's koha :( |
22:36 | thd | kados: German records do have capital letters in their indicators and have some field assignment mistakes in converting to MARC 21 but they are otherwise OK |
22:40 | kados: I recommend the subroutine I added to bulkmarcimport.pl for all MARC-8 to UTF-8 conversion | |
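thd's recommended subroutine in bulkmarcimport.pl is not reproduced in the log, but the shape of a MARC-8 to UTF-8 helper can be sketched. This is a hypothetical minimal version, assuming the CPAN module `MARC::Charset` (whose `marc8_to_utf8` function does the real conversion) is available on a Koha install; since ASCII is a shared subset of MARC-8 and UTF-8, ASCII-only fields pass through even without the module:

```perl
use strict;
use warnings;

# Minimal sketch of a MARC-8 -> UTF-8 conversion helper in the spirit of the
# subroutine mentioned above. MARC::Charset (assumed installed on a real Koha
# box) does the actual conversion; pure-ASCII values need no conversion at
# all, so the module is only loaded when non-ASCII bytes are present.
sub marc8_field_to_utf8 {
    my ($value) = @_;
    return $value unless $value =~ /[^\x00-\x7F]/;   # ASCII-only: unchanged
    require MARC::Charset;
    return MARC::Charset::marc8_to_utf8($value);
}

print marc8_field_to_utf8("Bibliothek"), "\n";   # ASCII passthrough
```

As thd warns a few lines later, almost no server has every record encoded correctly, so a real import path also needs error handling around the conversion.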
22:43 | kados: I do have a nice German music reviews server but some other union catalogues have expired passwords posted in the official place | |
22:52 | kados: do test on more than one search. Almost no server has every record encoded correctly. | |
05:03 | btoumi | hi all |
05:34 | hdl | kados have you checked with koha-2.2.7 z3950 client ? |
05:35 | I tried it and fixed encoding problems with French libraries. | |
05:35 | kados : maybe you can use what I did for latin1 for ISO-5426 | |
06:27 | js | hi #koha |
06:59 | tumer | paul:around? |
07:00 | toins:around? | |
07:00 | toins | hi tumer ! |
07:00 | yep i'm around ;-) | |
07:00 | tumer | hi toins |
07:01 | u are using mod_perl? | |
07:01 | toins | yes |
07:01 | and it rocks ! | |
07:01 | tumer | can you send me the http.conf part for koha? |
07:01 | toins | ok |
07:02 | tumer | i am having lots of authentication problems |
07:02 | did you change any koha scripts? | |
07:03 | toins | nope, i have just corrected some mod_perl-specific bugs |
07:03 | but now all is good | |
07:03 | (on rel_3_0) | |
07:04 | tumer | on mine everyone gets logged in as previous user and all gets mixed up |
07:04 | toins | #intranet |
07:04 | Listen 8079 | |
07:04 | <VirtualHost gerard:8079> | |
07:04 | PerlRequire /etc/apache2/rel_3_0.pl | |
07:04 | ServerAdmin antoineafarno.com | |
07:04 | DocumentRoot /home/toins/dev/rel_3_0/koha-tmpl | |
07:04 | ScriptAlias /cgi-bin/koha "/home/toins/dev/rel_3_0" | |
07:04 | ServerName gerard | |
07:04 | ErrorLog "/var/log/apache2/rel_3_0-error.log" | |
07:04 | Redirect permanent index.html http://gerard:8079/cgi-bin/koha/mainpage.pl | |
07:04 | SetEnv KOHA_CONF "/etc/koha/rel_3_0.xml" | |
07:04 | </VirtualHost> | |
07:05 | tumer | and rel_3.pl what does that include? |
07:05 | toins | rel_3_0.pl is : |
07:06 | #!/usr/bin/perl -w | |
07:06 | use strict; | |
07:06 | use Data::Dumper; | |
07:06 | use Apache::DBI; | |
07:06 | use lib qw(/home/toins/dev/rel_3_0); | |
07:06 | 1; | |
07:06 | use Apache::DBI is not required | |
07:07 | tumer, and i have on /etc/apache2/apache2.conf : | |
07:07 | <Files *.pl> | |
07:07 | SetHandler perl-script | |
07:07 | PerlResponseHandler ModPerl::Registry | |
07:07 | PerlOptions +ParseHeaders | |
07:07 | PerlSendHeader On | |
07:07 | Options +ExecCGI | |
07:07 | </Files> | |
07:08 | tumer | strange, I have very similar but it really is messing up! |
07:09 | thanks anyway | |
07:09 | i will give it another try | |
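tumer's symptom above (everyone gets logged in as the previous user) is the classic ModPerl::Registry pitfall: package-level variables survive between requests handled by the same persistent interpreter, whereas under plain CGI every request gets a fresh process. A toy demonstration of the failure mode, not Koha's actual code; the package and variable names are invented for illustration:

```perl
use strict;
use warnings;

# Toy demonstration (not Koha code) of why authentication state leaks under
# a persistent interpreter such as mod_perl: a package-level variable keeps
# its value from one "request" to the next.
{
    package LeakyAuth;
    our $current_user;            # persists for the life of the interpreter

    sub handle_request {
        my (%params) = @_;
        # BUG: only assigned when the request carries credentials, so a
        # request without them silently reuses the previous user's value.
        $current_user = $params{user} if defined $params{user};
        return $current_user;
    }
}

print LeakyAuth::handle_request(user => 'alice'), "\n";  # prints "alice"
print LeakyAuth::handle_request(), "\n";                 # still "alice": leaked
```

The usual fixes are to reset such state at the start of every request, or to keep it out of globals entirely and look the user up from the session store (keyed by cookie) on each request.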
07:10 | toins | there is a page on the wiki : http://wiki.koha.org/doku.php?id=mod_perl |
07:10 | tumer, you are running koha head ? | |
07:10 | tumer | yes toins |
07:11 | but i have not committed latest changes for some time now | |
07:20 | paul | hi tumer |
07:20 | a quick question... | |
07:20 | tumer | hi paul |
07:20 | yep! | |
07:20 | paul | iirc, you did something to let the users save their queries, didn't you ? |
07:21 | tumer | yes |
07:21 | paul | is the code committed on HEAD ? & could you tell me where it is |
07:21 | next question : is your "see where is it in the library (on a "map")" committed too ? | |
07:21 | & where is it hidden too ;-) | |
07:21 | tumer | search.pm and it is.. |
07:22 | paul | last question : did you commit somewhere your update_zebra script, that runs every 30 seconds |
07:22 | ? | |
07:22 | tumer | sub add_query_line |
07:23 | i'll check whether i committed zebra script | |
07:24 | paul | mmm... add_query_line is not what i'm looking for |
07:24 | iirc, the user could save a query he did previously, to be able to quickly find it again later. | |
07:24 | tumer | zebra script is under z3950 zebra_queue_start.pl |
07:25 | add_query_line saves the query which can be recalled after | |
07:25 | i have not committed any opac scripts that just calls it back | |
07:26 | all queries of the user are saved automatically | |
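The feature tumer describes (every user query saved automatically via `add_query_line` in Search.pm, recallable later from the OPAC) can be sketched as follows. Only the name `add_query_line` comes from the log; the body, the `recall_queries` helper, and the in-memory storage are hypothetical stand-ins for what would be a database table in Koha:

```perl
use strict;
use warnings;

# Sketch of the saved-query idea discussed above: every search a user runs
# is recorded so it can be recalled later. Storage here is an in-memory hash
# keyed by user id; the real feature would write to a table.
my %query_history;

sub add_query_line {                 # name from the log; body is hypothetical
    my ($userid, $query) = @_;
    push @{ $query_history{$userid} }, { query => $query, time => time() };
}

sub recall_queries {                 # what a "my searches" OPAC page might call
    my ($userid) = @_;
    return map { $_->{query} } @{ $query_history{$userid} || [] };
}

add_query_line('kados', 'ti=perl');
add_query_line('kados', 'au=wall');
print join(", ", recall_queries('kados')), "\n";   # prints "ti=perl, au=wall"
```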
07:26 | paul | zebra_queue_start unknown on my head cvs |
07:27 | tumer | lemme check |
07:27 | paul | lemme check my cvs copy too ;-) |
07:27 | (as it's quite outdated I bet | |
07:27 | ) | |
07:27 | tumer | its named zebraqueue_start.pl |
07:28 | paul | ok, gotcha, thx. |
07:28 | and the map feature, is it somewhere ? | |
07:29 | tumer | on windows i use another script to start that and fork out - windows specific, as i cannot use fork |
07:30 | paul | zebraqueue_start.pl => you added a zebraqueue table, that is filled by AddBiblio / ModBiblio & emptied by zebraqueue_start.pl, right ? |
07:30 | tumer | paul:i did not commit any opac scripts as i only have my templates with them |
07:30 | yes thats right | |
07:31 | paul | thanks, i think it's a very clean way of doing things & i'll probably adopt it on rel_3_0 |
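The design paul summarizes above (biblio writes fill a zebraqueue, a periodic job drains it and pushes the updates to Zebra) can be sketched with an in-memory queue. The names `AddBiblio`, `ModBiblio`, and `zebraqueue` come from the conversation; everything else is an illustrative stand-in, with a plain array where the real design uses a database table:

```perl
use strict;
use warnings;

# In-memory sketch of the zebraqueue pattern described above: biblio writes
# enqueue the record number, and a periodic job (zebraqueue_start.pl in the
# log, run every 30 seconds) drains the queue in one batch.
my @zebraqueue;

sub AddBiblio {                      # sketch only; real Koha does much more
    my ($biblionumber) = @_;
    push @zebraqueue, { biblionumber => $biblionumber, op => 'add' };
}

sub ModBiblio {
    my ($biblionumber) = @_;
    push @zebraqueue, { biblionumber => $biblionumber, op => 'mod' };
}

sub drain_zebraqueue {               # what the periodic job would do
    my @batch = splice @zebraqueue;  # take everything, leave the queue empty
    # ...send each queued record to Zebra here (omitted)...
    return scalar @batch;            # number of records processed
}

AddBiblio(1);
ModBiblio(1);
AddBiblio(2);
print drain_zebraqueue(), " records sent, ", scalar @zebraqueue, " left\n";
# prints "3 records sent, 0 left"
```

Decoupling indexing from the write path like this keeps AddBiblio/ModBiblio fast and makes Zebra updates resilient: if the indexer is down, entries simply accumulate in the queue until the next successful run.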
07:32 | thd | tumer: I have not been able to obtain a non-error response from your Z39.50 server |
07:33 | library.neu.edu.tr:9010|no|MARC 21|UTF-8|MARC 21|Yakın Doğu Üníversítesí (YDÜ) [Near East University (NEU)], Lefkoșa, Kuzey Kıbrıs Türk Cumhuriyeti (KKTC) [Nicosia, Turkish Republic of Northern Cyprus (TRNC)], Büyük Kütüphane [Grand Library] | |
07:35 | tumer: I have always had error: "database unavailable", error detail : "default" from that server | |
07:37 | tumer | thd database name is neulis |
07:38 | thd | you changed it from default? |
07:38 | tumer | its always been like date |
07:38 | s/date/that | |
07:39 | thd | I misunderstood then previously |
07:39 | tumer: your WAP service is nice | |
07:40 | tumer | thx |
07:40 | but SMS renewal service is used most | |
07:41 | thd | tumer: Is there no secondary option page WAP for viewing more complete bibliographic information such as edition, date, and extent to know if one has the correct edition of the title? |
07:43 | tumer | thd: it can be improved but we are sidetracked at the moment adding federated search for EBSCO and ProQuest databases |
07:43 | thd | tumer: what are you using for Federated searching? |
07:44 | tumer | wrote it myself with ZOOM |
07:46 | thd | tumer: what are you doing for duplicate biblios in EBSCO and ProQuest? |
07:46 | tumer | nothing |
07:47 | thd | tumer: are you able to obtain any thesaurus information from either to guide user queries? |
07:48 | tumer | scan operations are allowed yes |
07:48 | thd | tumer: I mean are you able to obtain thesaurus information to present to the user before the search is conducted? |
07:49 | tumer | ?? |
07:50 | thd | tumer: such that you could have the user query constrained by selecting from actual thesaurus contents? |
07:51 | tumer | you may do that if required |
07:52 | thd | tumer: they amalgamate indexing and abstracting services. Are you able to obtain the indexing and abstracting services' individual thesauri from EBSCO or ProQuest? |
07:52 | tumer | i have not asked them! |
07:53 | thd | tumer: would you consider asking about the possibility |
07:54 | tumer | they will probably tell me to buy their federated searching product |
07:55 | thd | tumer: their own product does not provide the thesaurus contents that I have seen except in a specific result set. |
07:57 | tumer: if you had the thesaurus information yourself you could build the best possible search of either, federated or otherwise. | |
07:59 | tumer: we should be able to do something similar for subject headings in the OPAC in due course. | |
08:01 | tumer: in the case of the OPAC we have subject authorities records for thesauri, but in the case of indexing and abstracting services like EBSCO and ProQuest we have almost nothing for thesauri. | |
08:03 | tumer | you may scan their subject field, download it and build your own |
08:04 | thd | tumer: would they allow that? |
08:05 | tumer: would they supply better, hierarchical thesaurus information if you asked? | |
08:05 | tumer | i do not think so. But downloading of the whole lot is allowed. They even provide all records on a CD |
08:06 | thd | tumer: complete with authorised and unauthorised forms of a subject? |
08:06 | tumer | to be added to local catalogue |
08:06 | thd | tumer: how many records is that? |
08:06 | tumer | nope only bibliographic with 650 subfields |
08:07 | thd | tumer: do their databases fit on one CD? |
08:07 | tumer | one or more i donno |
08:08 | thd have to go, see you again | |
08:08 | thd | tumer: I assume they have no provision for full text in your own OPAC |
08:08 | OK | |
08:09 | thank you for that information tumer | |
08:36 | tumer: if you are around, I also have database unavailable errors for neulis using library.neu.edu.tr:9010/neulis | |
08:36 | and I have those errors even if you are not around :) | |
08:41 | tumer | thd: it seems the server is down, thx |
08:41 | thd | tumer: you are welcome |