All times are shown in UTC.
Time | Nick | Message |
---|---|---|
12:59 | kados | hi all |
13:01 | paul: can you help me to get authorities/thesaurus working for koha.liblime.com? | |
13:03 | hdl: are you here? | |
13:03 | hdl | yes |
13:03 | If I can help | |
13:04 | kados | I would like to get authorities/thesaurus working for koha.liblime.com |
13:04 | but I don't understand how to do so ... | |
13:04 | hdl | Do you have authorities? |
13:05 | Can you search them ? | |
13:06 | kados | I have not done anything with authorities |
13:06 | don't have authorities | |
13:07 | can't I build authorities list from existing data? | |
13:11 | hdl | You can if you use rebuild_thesaurus ? paul pls confirm.... |
13:12 | or with the script I sent you. | |
13:12 | But you have to have subjects in your biblios. | |
13:14 | see build_authorities.pl | |
13:14 | with NC : (common nouns) for subjects. | |
13:15 | Subjects are supposed to be contained in 606$a | |
13:15 | And authority will be constructed in field 250. | |
13:16 | NC : Noms Communs in French. | |
13:16 | You should create a new thesaurus category called .... | |
13:16 | with an authtag to report, a summary... | |
13:17 | kados | in MARC21 I think subjects are spread throughout 6XX fields |
13:17 | maybe concentrated in 650$a | |
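A minimal sketch of the approach hdl describes, adapted to MARC 21 data (650$a rather than 606$a). This is not the UNIMARC build_authorities.pl; the input/output files and the bare 150 heading are illustrative only, and a real script would also load the result into Koha's authority tables.

```perl
#!/usr/bin/perl
# Sketch only: a MARC 21 analogue of the UNIMARC build_authorities.pl idea.
# Collect distinct 650$a topical headings from a file of biblio records and
# write minimal authority records with the heading in 150$a.
use strict;
use warnings;
use MARC::Batch;
use MARC::Record;
use MARC::Field;

my ($in, $out) = @ARGV;                        # usage: script biblios.mrc auths.mrc
my $batch = MARC::Batch->new('USMARC', $in);
open my $fh, '>', $out or die "cannot open $out: $!";

my %seen;
while (my $record = $batch->next()) {
    for my $field ($record->field('650')) {
        my $heading = $field->subfield('a') or next;
        next if $seen{$heading}++;             # keep each heading only once
        my $auth = MARC::Record->new();
        $auth->append_fields(MARC::Field->new('150', ' ', ' ', a => $heading));
        print {$fh} $auth->as_usmarc();
    }
}
close $fh;
```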
13:19 | paul | hello everybody |
13:19 | joshua : you'll see in koha-devel that tümer really rocks. | |
13:20 | he just sent me a file for a better inventory system. | |
13:23 | thd | kados: subjects are also in 6XX in UNIMARC |
13:24 | kados | excellent! |
13:24 | paul | (hdl: take a look at your mailbox) |
13:25 | kados | paul: what does it mean 'inventory system'? |
13:27 | thd | kados: I have made Koha authorities work before and could explain how. Building them requires making MARC 21 versions of the UNIMARC building scripts. |
13:28 | kados | thd: ok |
13:28 | thd: do you have those? or shall I build my own? | |
13:28 | thd: also, will it solve the '650x' problem? :-) | |
13:28 | :-) | |
13:29 | thd | kados: However, as I suggested before, you could import real, rich authorities with a working bulkautimport.pl. |
13:30 | kados: What is the 650X problem? | |
13:30 | kados | thd: did you see the explanation on the koha list? |
13:31 | thd: what's your email address? | |
13:31 | thd: kohaalinto.com ? | |
13:32 | thd: just sent you two emails | |
13:33 | thd: sent to me by Brooke Johnson | |
13:33 | thd: which explain the '650x' problem | |
13:33 | thd: though I can't say I fully understand it | |
13:34 | thd | kados: do you mean the problem where 650 $a 650$a $x 650 $a $x looked like 650 $a$a$x$a$x in the MARC view but now fixed with a preference? |
13:34 | kados | thd: no |
13:34 | owen | kados: I didn't see an explanation on the koha list, I just saw your reply |
13:35 | kados | right ... just realized she didn't forward it to the list ... dou! |
13:45 | thd | kados: Yes, I had understood that problem as being fundamental to how the indexes work in Koha 2.X. |
13:47 | kados | I've forwarded it to the devel list |
13:47 | 'indexes' being the marc_word table :-) | |
13:47 | thd: could you explain the problem again to me? | |
13:47 | owen: I'll forward them to you too | |
13:48 | thd | kados: That is why it is important that a search for John Smith will not match a book co-authored by Fred Smith and John Lake. |
13:49 | kados: under Koha 3.0 the above should not match. | |
13:49 | kados | right |
13:49 | because subject searching will be a 'phrase search' | |
13:49 | not a 'word search' | |
13:49 | if I'm understanding correctly | |
13:49 | I still don't get what 605x has to do with anything | |
13:49 | 650x even | |
13:50 | thd | kados: Unless the user selects a special checkbox to add the statement of responsibility to the search. |
13:51 | kados: It could be good to have that extra user option for finding editor, illustrator, and other names that often do not appear elsewhere in the record. | |
13:52 | kados: paul even has a library which has been using the statement of responsibility as the exclusive place to Esther author information. | |
13:53 | s/Esther/enter/ | |
13:54 | kados: to explain the 650 x problem think of how Koha 2 uses its indexes. | |
13:54 | kados | hmmm, I guess I just don't understand subjects well enough to understand |
13:54 | should 650x be treated differently than 650a? | |
13:55 | thd | kados: You could even use author names as in my example. |
13:56 | kados: however, even when two or more authors contribute to a book, only one is allowed in the main author entry 1XX. | |
13:57 | kados: The others have to go in the additional author fields 7XX. | |
14:01 | kados: The Koha bibliographic framework for MARC 21 should have 100 $a linked to biblio.author or whatever that column is. | |
14:05 | kados: 700 $a would be linked to additionalauthors.author however you can really ignore that link altogether. | |
14:06 | kados: Ignore that especially so that the explanation does not seem unduly complex. | |
14:07 | kados: author searching in Koha 2 does not require additionalauthors.author | |
14:08 | kados: Author searching uses the see also references in 100 $a | |
14:10 | kados | owen: did you get the messages? |
14:10 | thd | kados: so in the see also references you should have other 1XX in case the main author was not a personal author, as well as 7XX to catch all the additional authors. |
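If the intent is what thd describes, the change in Koha 2.2 would presumably live in the seealso column of marc_subfield_structure; the column name, delimiter, and exact subfield list below are assumptions to illustrate the idea, not a tested statement about the schema.

```perl
# Sketch only: widen the "see also" list for 100$a so an author search also
# considers the other 1XX entries and the 7XX added entries. Table/column
# names (marc_subfield_structure.seealso) and the comma delimiter are my
# assumptions about the Koha 2.2 schema -- verify before running.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('DBI:mysql:database=Koha', 'kohaadmin', 'password',
                       { RaiseError => 1 });
$dbh->do(q{
    UPDATE marc_subfield_structure
       SET seealso = '110a,111a,700a,710a,711a'
     WHERE tagfield = '100' AND tagsubfield = 'a'
});
$dbh->disconnect;
```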
14:10 | owen | Yes |
14:10 | kados | owen: any light to shed? :-) |
14:12 | thd | kados: So when Koha searches for the author what it is really searching on is a list of all the words in the see also subfields mushed together. |
14:14 | kados: what it ought to do is to search each separate field in the see also list separately to find matches. | |
14:15 | owen | Actually, kados, I'm finding it a little hard to understand what kind of searches Brooke is using as examples |
14:15 | kados | owen: glad I'm not the only one :-) |
14:15 | thd: that's 'phrase' searching rather than 'word' searching | |
14:15 | paul | kados : why does http://127.0.0.1:9006/cgi-bin/[…]etail.pl?bib=2946 |
14:15 | kados | thd: I get that part |
14:16 | paul | give me a "no image available" ? |
14:16 | kados | ? |
14:16 | paul: 127 is non-routable | |
14:16 | paul | (bureau.paulpoulain.com instead of 127.0.0.1 i mean |
14:16 | kados | ahh :-) |
14:17 | paul | (i've reintroduced the - in the isbn, but not for amazon service call) |
14:17 | kados | paul: I get 'Object not found!' |
14:17 | paul: can't load the page | |
14:17 | paul: I think the amazon module strips out the - automatically | |
14:17 | paul: but I can check ... hang on | |
14:17 | paul | mmm... could my firewall be closed ? |
14:17 | thd | kados: So in the example of an author search for John Smith it should match only if John Smith were in a single 1XX or a single 7XX but will match a book co-authored by Fred Smith in 100 and John Lake in 700. |
14:18 | paul | object not found => where ? when reaching bureau.paulpoulain.com:9006 ? |
14:18 | or in amazon ? | |
14:18 | thd | kados: the issue is not about phrase matching. |
14:18 | kados | bureau.paulpoulain.com:9006 |
14:19 | thd | kados: Phrase matching would only make the search more unreliable. |
14:19 | paul | everything open. does it work better now ? |
14:20 | thd | kados: the issue is about doing word matching using the contents of only one field at a time. |
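A toy illustration (plain Perl, not Koha code) of the difference thd is describing: with all author words mushed into one bag per record, "John Smith" matches a record whose 100 is Fred Smith and whose 700 is John Lake; requiring every query word to appear within a single field does not.

```perl
# Toy illustration of word matching per field versus one mushed word bag.
use strict;
use warnings;

sub all_words_in {
    my ($text, @words) = @_;
    my %have = map { lc($_) => 1 } split ' ', $text;
    for my $w (@words) { return 0 unless $have{ lc $w } }
    return 1;
}

my @author_fields = ('Fred Smith', 'John Lake');   # one record: 100 and 700
my @query         = ('John', 'Smith');

# Koha 2 style: every author word from every field in one bag -> false hit.
my $mushed = all_words_in(join(' ', @author_fields), @query);

# Per-field matching: the whole query must be satisfied inside a single field.
my $per_field = grep { all_words_in($_, @query) } @author_fields;

print 'mushed bag: ', ($mushed    ? 'match' : 'no match'), "\n";   # match
print 'per field: ', ($per_field ? 'match' : 'no match'), "\n";   # no match
```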
14:20 | kados | paul: no :/ |
14:20 | paul | strange |
14:20 | kados | http://bureau.paulpoulain.com/[…]etail.pl?bib=2946 |
14:20 | is url right? | |
14:20 | paul | hehe. |
14:20 | NO | |
14:20 | kados | ahh |
14:20 | paul | it's :9006 |
14:20 | bureau.paulpoulain.com:9006 | |
14:20 | thd | kados: There are however phrase matching issues for subjects. |
14:20 | kados | got it |
14:21 | paul: so what's the problem? | |
14:21 | paul | + it seems there is no handling of "SIMILAR_PRODUCTS" in npl templates |
14:21 | kados | paul: you expect an image to show up for that book? |
14:21 | paul | there is no picture, even if the isbn is valid & english |
14:21 | yes. I expected | |
14:22 | kados | paul: probably amazon does not have an image for this item |
14:22 | paul | hehe... |
14:22 | it works... | |
14:22 | http://bureau.paulpoulain.com:[…]etail.pl?bib=9767 | |
14:22 | has a review ! | |
14:22 | kados | paul: :-) |
14:23 | still no image though :-) | |
14:23 | owen | kados, I do see /part/ of the issue that Brooke explains |
14:23 | kados | paul: add the opacsearchresults.tmpl amazon images stuff |
14:23 | paul: to default, then you can tell which items will have images from the results page | |
14:24 | owen: do tell | |
14:24 | owen | Took me a bit to find an example, but look here: http://search.athenscounty.lib[…]ail.pl?bib=101805 |
14:24 | The linked subjects are 'flour mills,' 'grain,' and 'frontier and pioneer life' | |
14:25 | kados | right |
14:25 | owen | But if you look at the MARC record, you see that the actual subjects are more complex: |
14:25 | Flour-mills -- History -- Juvenile literature | |
14:25 | kados | right |
14:26 | I don't quite get the concept of a subdivision | |
14:26 | thd | kados: Owen's example is using an old version of Koha. |
14:26 | kados | thd: right |
14:26 | owen | So following the 'flour-mills' link won't really give you matching items, it will give you a more general set |
14:26 | paul | YESSS... search "linux" |
14:26 | owen | The subdivisions are refining characteristics |
14:26 | kados | paul: wohoo! |
14:27 | paul: still no image on details screen | |
14:27 | paul: make a new variable | |
14:27 | paul: called 'amazonisbn' | |
14:27 | owen | So, not just, "Flour mills," but /history/ of Flour Mills. And not just history of flour mills, but /juvenile literature/ history of flour mills. |
14:27 | kados | paul: that s/-//g; |
14:28 | paul: should solve the detail screen prob | |
14:28 | paul: don't forget about the 'search inside' feature | |
14:28 | owen: it seems to me it should be: | |
14:29 | Juvenile literature | |
14:29 | | | |
14:29 | --- History | |
14:29 | | | |
14:29 | thd | paul: please commit that code if you have fixed a breakable aspect of amazon web services data. |
14:30 | kados | ---- Flour mills |
14:30 | owen: is that right? | |
14:30 | owen: or are Juv lit and Hist at the same level of the hierarchy? | |
14:31 | owen | I think it just depends on how you look at it. The classification as it is in this example leads with the most useful "entry point" |
14:32 | But it is an unusual way to look at a hierarchy if you're not used to the librarian-way | |
14:32 | kados | I guess I just don't understand how anyone could formulate a search in a library catalog using that scheme |
14:32 | thd | owen it is partly hierarchical and partly faceted. |
14:33 | kados | without something like CQL |
14:33 | or the advanced boolean MARC search | |
14:34 | paul: when you get a sec ... I have some data without entries in the 090$c ... and bulkmarcimport doesn't seem to add these entries in head | |
14:34 | thd | kados: Did you understand my explanation about author searching and why it is not the lack of phrase searching that is the problem? |
14:35 | owen | kados: this issue is probably part of the reason why OPACs are thought to be hard to use |
14:35 | kados | paul: wait ... not sure that is the problem |
14:36 | paul: here's the error: | |
14:36 | DBD::mysql::st execute failed: Column 'biblioitemnumber' cannot be null at C4/Biblio.pm line 1390. | |
14:36 | MARC::Record=HASH(0x8b1151c) at C4/Biblio.pm line 1454. | |
14:36 | 952 at C4/Biblio.pm line 1455. | |
14:36 | MARC::Field=HASH(0x8b3e79c) at C4/Biblio.pm line 1456. | |
14:36 | owen | Imagine how much better searching we could offer if we indexed the content of each book the same way Google indexes the content of each web site? |
14:36 | paul | sorry kados, but I really have to leave now. |
14:36 | kados | paul: ok no prob |
14:36 | paul | maybe on monday. |
14:36 | kados | paul: I'll investigate |
14:36 | paul: have a good weekend | |
14:36 | paul | (tuesday - thursday, i'll be in Paris) |
14:36 | kados | k |
14:36 | thd | owen: Google hardly uses fielded indexes at all. |
14:36 | paul | thursday = working with Pierrick, who starts his job at Ineo ;-) ) |
14:37 | kados | woohoo |
14:37 | thd: I'm trying to understand the subject prob first :-) | |
14:37 | thd | owen: a Google search on library records would be this problem multiplied many times over. |
14:37 | kados | i guess I don't understand what brooke expects |
14:38 | owen | kados: At the very least, to have subjects link properly from the detail screen |
14:38 | I'm not sure what the answer is in terms of searching. | |
14:39 | thd | kados: They are the same problem, however subjects can add an additional layer of complexity to the problem. |
14:39 | owen | It's part of the reason why it might be better to offer a completely different kind of search when it comes to subject |
14:40 | kados | I liked the old Koha subject headings search |
14:40 | where you do a search, it pulls up a list of possible headings | |
14:41 | and you pick what you want ... then it gives you results | |
14:41 | do you think that's what she's looking for? | |
14:41 | thd: what's the additional layer? | |
14:41 | owen | I don't know about her, but I certainly think that is a valuable way to handle subject searches specifically |
14:41 | kados | I still don't know how you would handle the hierarchy though |
14:42 | owen | paul's dictionary search is somewhat similar, but not user-friendly enough for use by the general public, I think |
14:42 | kados | I'm guessing there are many instances like the above when it's not clear |
14:42 | which is the parent which is the child and which are siblings | |
14:43 | maybe the thing to do is get authorities going on koha.liblime.com | |
14:43 | thd: can you help me with that? | |
14:43 | paul_away | (the problem on opac-detail is dashed isbn. |
14:43 | i'll solve it on monday | |
14:43 | kados | thd: you said you had got it going before |
14:43 | paul_away: simple solution is to have another variable for 'amazonisbn' | |
14:43 | paul_away | right. |
14:43 | kados | paul_away: with s/-//g; |
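A sketch of the 'amazonisbn' variable kados suggests: keep the dashed ISBN for display and pass a dash-free copy to the Amazon call. Only the s/-//g and the variable name come from the conversation; the surrounding template code is illustrative.

```perl
# Sketch: keep the dashed ISBN for the OPAC display, strip it for Amazon.
my $isbn = $biblio->{isbn};               # e.g. "0-596-00027-8" (illustrative)
(my $amazonisbn = $isbn) =~ s/-//g;       # "0596000278"

$template->param(
    isbn       => $isbn,                  # shown on the detail page
    amazonisbn => $amazonisbn,            # used for the Amazon web services call
);
```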
15:05 | thd | kados: well I have been timed out for a while having a conversation with myself. |
15:07 | <thd> kados: Look at the same issue for authors it will help to understand the issue for subjects which I will explain. | |
15:07 | <thd> kados: Try to understand the issue for authors first because it is potentially more confusing for subjects. | |
15:07 | <thd> kados: in every case each field should be searched separately for matches whether they are repeating fields like 650 and 700 or non-repeating like 100. | |
15:08 | <thd> kados: If you toss all the words in 1XX subfields into the same index with the words from 7XX subfields you have matches across field boundaries to a search. | |
15:08 | <thd> kados: Similarly, If you toss all the words in multiple 650 subfields into the same index you have matches across field boundaries to a search. | |
15:08 | <thd> kados: Imagine in our subject heading scheme we have subject headings for science and subject headings for history. | |
15:08 | <thd> kados: Imagine also in our subject heading scheme we have subject subdivision science and subject subdivision history. | |
15:08 | <thd> kados: So we might have 650 $a Science for the most general books about science and 650 $a History for the most general books about history. | |
15:08 | <thd> kados: We might also have 650 $a Science $x History for books about the history of science and 650 $a History $x Science for books about scientific techniques to uncover historical evidence. | |
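A small sketch of the distinction thd draws: if each 650 is built into an ordered heading string ($a then each $x joined with " -- "), "Science -- History" and "History -- Science" stay distinct, which a single bag of subject words cannot do.

```perl
# Sketch: build each 650 heading as an ordered string so that
# "Science -- History" and "History -- Science" remain distinct.
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

my $record = MARC::Record->new();
$record->append_fields(
    MARC::Field->new('650', ' ', '0', a => 'Science', x => 'History'),
);

for my $field ($record->field('650')) {
    my @parts   = map { $_->[1] } $field->subfields();   # keep subfield order
    my $heading = join(' -- ', @parts);
    print "$heading\n";        # "Science -- History", not a bag of two words
}
```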
15:11 | kados: are you with me? The way that authorities work in Koha 2 will not help. | |
15:15 | /me missed that kados went to get lunch | |
15:16 | kados: ping me when you are back from lunch | |
16:02 | kados | owen-away: you around? |
16:02 | thd: I'm back | |
16:03 | thd | kados: Do you see the posts I reposted after I reconnected? |
16:03 | kados | yep |
16:04 | so if I see a $x I should use that as 'of' whatever is in $a | |
16:04 | ie Science of History | |
16:05 | honestly, does ANY ILS handle that correctly? | |
16:05 | thd | kados: so if you search for Science--History you find History--Science |
16:05 | kados | what would be the 'correct' way to handle it? |
16:05 | ahh | |
16:05 | y'know, I don't think even zebra can do that | |
16:06 | thd | kados: As far as I know most ILS systems handle this correctly |
16:06 | kados | since you can't match across field boundries |
16:06 | thd: could you give me an example? | |
16:07 | thd | kados: you need to search separate indexes that are not all keyed to one mushy jumble. |
16:07 | kados | I'd like to see an opac that does this correctly |
16:08 | so I can get a sense of what folks are expecting | |
16:08 | thd | kados: the old MELVYL system did this correctly and I have copies of the manual describing how the system worked internally. |
16:09 | kados | thd: I'd rather see a live OPAC if you have a link :-) |
16:10 | thd | kados: any live OPAC should work the same. |
16:10 | owen | Here's a Spydus catalog if you're interested |
16:10 | thd | kados: I only have the internal design for the MELVYL system. |
16:10 | owen | http://spydus.nmit.vic.edu.au/ |
16:13 | thd | kados: Endeavor's voyager works just the same. http://catalog.loc.gov |
16:13 | owen | The same as what? |
16:15 | thd | owen: The same as MELVYL or any other OPAC in restricting fielded searches to matching a single field and not across the boundary of multiple repeated fields. |
16:15 | owen | Can you give an example? |
16:16 | kados | owen: I kinda wish we kept the old spydus system going |
16:16 | owen: then we could compare with the same data | |
16:17 | owen | I know what you mean |
16:17 | ...but not enough to really wish it ;) | |
16:17 | kados | owen: so I click on 'subject' |
16:17 | hehe | |
16:17 | thd | owen: see my example about authors from a couple of hours ago.. |
16:17 | kados | I type in 'history |
16:18 | and I get a list of history headings | |
16:18 | thd | kados: So in the example of an author search for John Smith it should match only if John Smith were in a single 1XX or a single 7XX but will match a book co-authored by Fred Smith in 100 and John Lake in 700 in Koha 2. |
16:18 | kados | this looks like a subject authorities list |
16:18 | thd | owen see above |
16:19 | kados | thd: it's a fundamentally different search |
16:19 | thd: in fact, not a search at all ... but a 'browse' | |
16:20 | thd | kados: However, the way that authorities work in Koha 2 will not solve this problem. |
16:20 | kados: It is a search followed by a browse of matches. | |
16:22 | kados: In Koha 2, the authority value is limited to a single subfield. | |
16:22 | kados: paul actually has no plans to change that for 3.0 | |
16:24 | kados: That kills searches for 650 $a $x or anything more complex than one subfield in Koha using authorities. | |
16:25 | kados: so it only works about half of the time with the general population of records in the world. | |
16:25 | kados | ok |
16:25 | so what we need to do | |
16:26 | is build authorities lists based on 650$a and 650$x | |
16:26 | right? | |
16:26 | are there other tags/subfields to worry about too? | |
16:27 | thd | kados: this problem is not strictly an authorities problem. |
16:27 | kados | thd: hang on ... another question if you have a sec |
16:27 | thd | kados: and yes the list of subfields is longer |
16:27 | kados | thd: steven just wrote: |
16:27 | Koha doesn't -- that is, at | |
16:27 | +present, it CAN'T -- make use of the required $6 linking subfield.* | |
16:28 | thd: is that a framework problem or is it true? | |
16:29 | thd | kados: $6 is unmanaged in the poor default framework that comes with Koha now. |
16:29 | kados | thd: lets get it working so I can respond to him |
16:29 | thd | kados: I have switched it to managed. |
16:29 | kados | thd: I'm sick of his bullshit |
16:30 | thd | kados: his comments are correct to the extent of his investigation. |
16:30 | kados | thd: right, but instead of asking how to set up the framework, he frames it in terms of a inability |
16:30 | thd: which makes Koha look bad | |
16:31 | thd: anyway, lets get it working on koha.liblime.com so I can respond to him asap | |
16:31 | thd: this would be used to link two different language records right? | |
16:31 | thd: ie, two records cataloged in a different language? | |
16:31 | thd: which tag would have the $6? | |
16:32 | thd | kados: no other system asks the user to do as much configuration of software to have correct behaviour but we are fixing that. |
16:32 | kados | thd: yep, we are :-) |
16:33 | I assume $6 in tag 800 right? | |
16:34 | should link be 245$a ? | |
16:35 | thd | kados: here is an example |
16:35 | 245 | |
16:35 | 10$6880-03$aSosei to kakō$bNihon Sosei KakōGakkai shi. | |
16:35 | 880 | |
16:35 | 10$6245-03/$1$a[Title in Japanese script]: $b[Subtitle in Japanese script]. | |
16:36 | kados | so 880$6 should be linked to 245 $a right? |
16:37 | thd | kados: each refers back to the other in this example |
16:37 | kados | I see |
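A sketch of the reciprocal $6 links in thd's example above, built with MARC::Record; the occurrence number 03 and the bracketed Japanese-script placeholders follow the example, and the diacritics are dropped here for simplicity.

```perl
# Sketch of the reciprocal $6 links in thd's 245/880 example.
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

my $record = MARC::Record->new();
$record->append_fields(
    # Romanised title, pointing at its alternate-script counterpart (880, occurrence 03)
    MARC::Field->new('245', '1', '0',
        '6' => '880-03',
        a   => 'Sosei to kako :',
        b   => 'Nihon Sosei Kako Gakkai shi.',
    ),
    # Alternate graphic representation, pointing back at 245 occurrence 03
    MARC::Field->new('880', '1', '0',
        '6' => '245-03/$1',
        a   => '[Title in Japanese script] :',
        b   => '[Subtitle in Japanese script].',
    ),
);
print $record->as_formatted(), "\n";
```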
16:37 | does Koha's Link feature work well here? | |
16:37 | I'm adding an 880 tag | |
16:37 | is 880 repeatable? | |
16:38 | thd | kados: yes 880 is repeatable |
16:38 | kados | what's a good label for 880? |
16:39 | thd | kados: which Koha link feature are you asking about? |
16:39 | kados | when editing subfields |
16:39 | I see a 'link' entry | |
16:39 | but I'm not sure what it's for | |
16:39 | thd | kados: 880 ALTERNATE GRAPHIC REPRESENTATION |
16:40 | kados | ok ... thx |
16:40 | so ... valid subfields? | |
16:40 | $6 for starters | |
16:40 | any other? | |
16:41 | $a-z it looks like | |
16:41 | and $0-5, $7-9 | |
16:41 | http://www.loc.gov/marc/biblio[…]c/nlr/nlr8xx.html | |
16:41 | don't know what 'same as associated field' means | |
16:41 | thd | kados: That link feature in the bibliographic frameworks will allow searching from a link and then browsing a list of results returned from the linked field/subfield pairs |
16:42 | kados | thd: so the 'link' feature is not what we're looking for? |
16:43 | thd | kados: That link feature functions similarly to how Koha searches authorities except that authority searches will find the authorised value for tracings and references of unauthorised forms |
16:43 | kados | thd: so how do we set up 880 $6 in such a way that I can respond properly to Steven? |
16:44 | thd | kados: Simply allow it to be managed in Koha. |
16:45 | kados: I am sorry that building the framework has taken so long. I did not appreciate how many fields were actually missing from the existing default. | |
16:46 | kados: allow $6 to be managed in koha. | |
16:46 | kados | already have :-) |
16:46 | writing a response now :-) | |
16:46 | thd | kados: I could send you an incomplete SQL file but I have not come to 880 yet. |
16:47 | kados | thd: take your time and get it right ... |
16:52 | thd | kados: I am editing it in vim as I had intended. I started in the user interface until I had a significant set of values with which to work. The user interface seemed a little slow. Tabbing between fields sequentially only worked sometimes, and something is preventing saving repeatability at the same time as creating a new subfield. I fixed one minor user interface bug and then quit using the user interface. |
16:54 | kados: syntax colouring should save me from any careless editing. | |
16:54 | kados: Although I did almost lose 266 lines when I fell asleep at the keyboard last night :) | |
16:56 | kados: everything will be thoroughly double checked. My early work with the user interface problems will have been triple checked. | |
16:57 | kados | thd: sounds very comprehensive :-) |
16:57 | thd | kados: When it is done, if people lose data in Koha the default framework will have nothing to do with it. |
16:57 | kados | excellent! |
16:59 | thd | kados: |
17:01 | kados: We need a plan for how to get the several different frameworks into existing Koha installations for 2.2.6. | |
17:02 | kados: currently the updater would not do that and you do not want to overwrite the modified frameworks that people may have created for themselves. | |
17:04 | kados: The updater needs to be able to install frameworks into a namespace that will not interfere with existing local frameworks that people should stop using once they have a good set of default frameworks. | |
17:06 | kados: The default frameworks just need a special tag at the beginning of the name to make it very unlikely that they will clobber any frameworks that the user may have made themselves. | |
17:07 | kados: However, the updater will not add any new frameworks as it is designed now. | |
17:08 | kados: are you still there? | |
17:09 | kados | thd: yep |
17:10 | thd: trying to manage the mailing list which bounced my message | |
17:10 | but it's currently timing out | |
17:10 | arrg :-) | |
17:10 | thd | kados: The koha list is timing out? |
17:10 | kados | thd: the management interface |
17:11 | thd: i think the upgrade path is as follows: | |
17:11 | thd: replicate existing framework as 'local' | |
17:11 | thd: replace default with 'Standard' | |
17:12 | thd: but honestly, I don't think anyone has been messing with their frameworks | |
17:12 | thd: so it probably won't be a problem | |
17:12 | thd: here's a question | |
17:12 | thd: what if I have a 'search also' defined differently in two frameworks | |
17:12 | thd | kados: someone needs to mess with the updater so that any new frameworks can be added. |
17:13 | kados | thd: does Koha just use the default no matter what? |
17:13 | thd: I spoke to paul about that ... he's going to help me to get this working in 2.2.6 before it is released | |
17:13 | thd | kados: that is a question for paul or experimentation. |
17:15 | kados: what was paul's answer about the differently defined see also values? | |
17:15 | kados | thd: I forgot to ask him |
17:18 | thd | kados: I think that you now have to select a framework for editing a record. The frameworks system has no way of determining which framework to apply in advance of your choosing. |
17:19 | kados: If you commit your plugins I can have those already included in the default framework on which I am working. | |
17:21 | kados | thd: ok ... I think they need a bit more work first though |
17:21 | thd: I'd like to get some relationships working | |
17:21 | thd | kados: There is an issue about variance in the holdings fields that needs to be handled carefully when adding new frameworks to existing installations. |
17:23 | kados: Whatever fields and configuration that people are now using for holdings needs to be selected and added to the standard frameworks. | |
17:24 | kados: That may be the most likely point of framework variation. | |
17:24 | kados | thd: I really think it's very rare for those to be adjusted at all |
17:24 | thd: and manual adjustment of those isn't hard anyway | |
17:25 | thd: it's reasonable to expect folks to do that to get the advantages of the new framework | |
17:26 | thd | kados: such unadventurous libraries you have which have otherwise been bold enough to take the Koha adventure ride :) |
17:29 | kados | thd: are you on the koha-win32 list? |
17:30 | thd: check out the latest post by Carol Ku | |
17:30 | thd | kados: I just reminded myself that issues involving $6 identifying which is the correct 880 to reference are one reason why the record editor needs to be able to reorder fields and not just subfields. |
17:30 | kados | thd: it has to do with the $6 |
17:31 | I quote: | |
17:31 | i think the $6 linking field is different from a regular subfield a, b or c | |
17:31 | etc. | |
17:31 | In MARC, all the information on the book will be stored in the native | |
17:31 | language in tag 880. Then they use $6 linking field to tie 880 to tag 100 | |
17:31 | for Name etc... so e.g. 880 $6100 a.... so this tag means information | |
17:31 | stored here is the author name (designated by code $6100) in e.g Chinese. | |
17:31 | $6 is not a regular subfield.... | |
17:31 | thd | kados: I am not on that list but I should be for all the help I have given to Carol off list with getting Koha configured correctly on MS Windows. |
17:31 | kados | thd: so is that a proper use for the 'link' feature in Koha? |
17:34 | thd | kados: Koha has no feature designed to substitute one value for another in the display based upon MARC concepts. |
17:35 | kados | thd: and that's what carol is discussing? |
17:36 | thd: I don't get it ... | |
17:36 | thd | kados: the link feature would allow you to do for other fields what you can already do with '...' in the OPAC search for a few fields in the original Koha SQL tables. |
17:36 | kados | thd: which is what? |
17:38 | thd | kados: the advanced OPAC search page allows you to search for matches returned from biblio.author in the '...' link next to the author search box. |
17:39 | kados: you then choose the author you want from that list. | |
17:39 | kados | right |
17:39 | thd | kados: the ordinary search just fins you all the biblio records matching your search. |
17:40 | s/fins/finds/ | |
17:40 | kados | I understand that |
17:40 | but what does that have to do with 880$6? | |
17:41 | (and incidentally, that search doesn't really do anything since once you find what you're looking for and then actually do the search, it just returns results as it normally would) | |
17:42 | thd | kados: the subfield link feature allows you to control similar behaviour. |
17:43 | kados: I did not appreciate that the end result was the same. Maybe it is not always the same. | |
17:43 | kados | thd: it is :-) |
17:44 | thd | kados: But there is nothing in that feature to help the OPAC display a special character set, which is what Carol wants. |
17:46 | kados: Carol needs cataloguing in Chinese. She has Koha 2 running in UTF-8 so she can catalogue in Chinese. | |
17:47 | kados: I was corresponding with her a few times a week until the nonsense in New Jersey over $1.25 started to become very serious. | |
17:48 | kados: I have not written back to her to explain my disappearance. | |
17:50 | kados: She wants the OPAC to use $6 to display character encoding according to rules for the language specified in the corresponding 880. | |
17:51 | kados: Koha is not going to do that for her yet because it has no mechanism set up for that. | |
17:52 | kados: That would be late stage character set management. | |
17:53 | kados: i need to get MARC-8 to UTF-8 working before going on to more difficult problems. | |
17:54 | kados | thd: actually, that's already done : http://open-ils.org/blog/?p=14 |
17:54 | thd: :-) | |
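For the MARC-8 to UTF-8 step thd mentions, CPAN's MARC::Charset already provides marc8_to_utf8(); a minimal sketch of applying it to each data subfield (the input file name and the write-out step are illustrative):

```perl
# Sketch: convert MARC-8 encoded subfield data to UTF-8 with MARC::Charset.
use strict;
use warnings;
use MARC::Batch;
use MARC::Field;
use MARC::Charset qw(marc8_to_utf8);

my $batch = MARC::Batch->new('USMARC', 'marc8_records.mrc');   # illustrative name
while (my $record = $batch->next()) {
    for my $field ($record->fields()) {
        next if $field->is_control_field();
        my @converted = map { ($_->[0], marc8_to_utf8($_->[1])) } $field->subfields();
        $field->replace_with(
            MARC::Field->new($field->tag(), $field->indicator(1),
                             $field->indicator(2), @converted)
        );
    }
    # ... write the record back out, with leader/09 set to 'a' (Unicode)
}
```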
17:55 | thd | kados: Assisting Carol is further complicated by my not knowing Chinese and not having the correct glyphs installed for some examples she has sent. |
17:56 | kados | thd: since Carol's system is already utf-8 she shouldn't need to have multiple char sets working on the same page |
17:56 | thd: encoding sets I mean | |
17:56 | she should be able to view chinese and english together | |
17:56 | thd: I took two years of Chinese btw :-) | |
17:56 | thd | kados: yes she can |
17:58 | kados: you need great patience to help her. She is a MS Windows user with very limited experience managing computer systems in a sophisticated manner. | |
17:58 | kados | thd: http://www.loc.gov/marc/biblio[…]c/nlr/nlr8xx.html |
17:59 | thd: it seems that the 880 subfields | |
17:59 | thd: rely on associated fields for their labels | |
17:59 | thd | kados: I have written very careful step by step instructions to her from my memory of how Windows systems work. |
17:59 | kados | thd: I suppose we'll have to just set up very generic labels in the framework |
18:00 | thd: right? | |
18:00 | thd: I still don't get what the difference between 'Link' and 'search also' is | |
18:01 | thd: in Koha's subfields constraints | |
18:04 | thd | kados: character set encoding is not the whole problem because there are language specific rules to how to read and display the characters that you have once you have them correctly encoded. |
18:06 | kados: also the issue has to be addressed where the remote OPAC user does not have a system that allows them to enter Unicode. | |
18:06 | kados: That is not a problem for Chinese because Unicode is absolutely required unless it has been romanised. | |
18:09 | kados: There is still the problem in the US and France where legacy environments used by remote users may not support Unicode, so translating query strings is necessary. | |
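A minimal sketch of the query-string translation thd mentions, assuming the legacy client sends ISO-8859-1; that assumption, and the function name, are illustrative.

```perl
# Sketch: normalise a query string from a legacy Latin-1 client to UTF-8
# before searching a UTF-8 database (the Latin-1 assumption is illustrative).
use strict;
use warnings;
use Encode qw(decode encode);

sub query_to_utf8 {
    my ($raw, $client_charset) = @_;
    $client_charset ||= 'iso-8859-1';
    my $characters = decode($client_charset, $raw);   # bytes -> perl characters
    return encode('UTF-8', $characters);              # characters -> UTF-8 bytes
}

my $utf8_query = query_to_utf8("r\xE9volution");      # "révolution" in Latin-1
```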
18:12 | kados: I have not really experimented with the frameworks link feature, but I understood the function hdl identified for it, which is mentioned in the help file, when he tried to explain it to you very briefly. | |
18:14 | kados: I presume that as see also applies to basic searches so link applies to '...' searches. | |
18:14 | kados | thd: how to read and display chars won't affect Chinese as Chinese Unicode is left-to-right as English is |
18:15 | ahh ... the help file ... forgot about that :-) | |
18:16 | thd | kados: Yes, that may be much less of an issue for Chinese than for the languages with more complex rules for how to interpret a set of characters for display. |
18:17 | kados: That help message is not very helpful unless you know what it means already. | |
18:17 | kados | thd: the help seems to indicate you can use the 'link' field to manage 'biblios connected to biblios' |
18:18 | thd: For example, put 011a in 464$x, will find the serial that was previously with this issn. With the 4xx plugin, you get a powerful tool to manage biblios connected to biblios | |
18:18 | thd | kados: That help message is translated from French. |
18:18 | kados | right |
18:18 | I think it could be easily used as a way to show relationships between two records | |
18:20 | thd | kados: $6 is a relationship in one record |
18:20 | kados | right |
18:23 | thd | kados: I see that can be used to find serials across title changes when the ISSN changes as well without even actually using authorities. |
18:23 | kados | right |
18:24 | chris | its actually quite a nice feature |
18:24 | kados | I wish I could get some demos going of this kind of functionality |
18:24 | but I don't understand it well enough from the user's point of view | |
18:24 | incidentally Carol got her Koha going with utf-8 | |
18:24 | thd | kados: well I will include it in my default framework. |
18:24 | kados | thd: examples? |
18:25 | thd | kados: examples of what? |
18:25 | kados | thd: I would like to get authorities working on my demo |
18:25 | thd: as well as the 'link' feature | |
18:26 | thd | kados: Carol had Koha working in UTF-8 long before I helped her thanks to chris |
18:26 | chris | yay me |
18:26 | heheh | |
18:26 | kados | chris: so what was involved ? |
18:27 | chris | changing all the templates |
18:27 | kados | chris: it might be useful for demo purposes for koha.liblime.com to be in utf-8 |
18:27 | chris: so I could throw some chinese in there for instance | |
18:27 | chris | right |
18:27 | if the templates are served as utf-8 | |
18:27 | then it just works | |
18:27 | thd | kados: When I dropped off the map Carol seemed to be having MARC-8 problems for importing records that were not already in UTF-8. |
18:28 | kados | huh ... so what about the probs paul's been having? |
18:28 | chris: and what about the data in the database? | |
18:28 | chris | the data in the database is fine |
18:28 | i think the problems paul is having is when you try to modify the data | |
18:29 | thd | kados: paul has been concerned with 100% UTF-8 compatibility even for keyed values and identifiers. |
18:29 | chris | <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1" /> |
18:29 | change that | |
18:29 | in the templates | |
18:29 | kados | chris: but that's just the meta tag ... what does that have to do with how the content is served up? |
18:30 | chris: isn't that determined by the script? | |
18:30 | chris | that will mean that it will come to the script as utf-8 |
18:30 | when a user submits | |
18:31 | as far as i can remember | |
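The change chris describes, sketched from both sides: the meta tag in the .tmpl files and the matching HTTP header a CGI script could send; where Koha actually sets the header may differ.

```perl
# Sketch: serve pages as UTF-8 so browsers also submit form data as UTF-8.
# In the .tmpl files the meta tag chris quotes becomes:
#   <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
# and the CGI side can declare the same charset in the HTTP header:
use strict;
use warnings;
use CGI;

my $query = CGI->new;
print $query->header(-type => 'text/html', -charset => 'UTF-8');
```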
18:31 | thd | kados: The problems paul has had relate not merely to the record content stored but to the keys and identifiers for that content. |
18:32 | chris | the data in the database displays fine |
18:32 | kados | well ok ... opac.liblime.com is utf-8 now :-) |
18:32 | chris | if you have a couple of marc records with utf-8 chars in them |
18:33 | we could try throwing them in | |
18:33 | and then see how they display | |
18:33 | thd | kados: All the French accented characters become multibyte in UTF-8 where they were only one byte in ISO-8859. |
18:34 | chris | or some french ones |
18:36 | thd | It is now possible to obtain UTF-8 records from LC which will reduce many MARC-8 problems. |
18:37 | I am not certain how to or if it is yet possible to specify a request for UTF-8 records from the LC Z39.50 server. | |
18:37 | kados | ok both koha.liblime.com and opac.liblime.com are utf-8 now |
18:38 | thd: if you could request by leader position 6 | |
18:38 | thd: you could find out what charset they had | |
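A sketch of checking a downloaded record's declared encoding from the leader. Note that in MARC 21 the character coding scheme is leader position 09 ('a' means UCS/Unicode), while position 06 is the record type; the file name below is illustrative.

```perl
# Sketch: inspect the leader to see whether a record claims Unicode encoding.
# In MARC 21 the character coding scheme is leader position 09
# ('a' = UCS/Unicode, blank = MARC-8); position 06 is the record type.
use strict;
use warnings;
use MARC::Batch;

my $batch = MARC::Batch->new('USMARC', 'downloaded.mrc');   # illustrative file
while (my $record = $batch->next()) {
    my $coding = substr($record->leader(), 9, 1);
    printf "%s => %s\n", $record->title() || '(no title)',
           $coding eq 'a' ? 'UTF-8 (Unicode)' : 'MARC-8 or other';
}
```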
18:38 | chris | kados, you could try editing a record |
18:39 | and put some utf-8 chars in | |
18:39 | and see what happens | |
18:39 | kados | chris: you mean do actual cataloging? *gasp* :-) |
18:39 | chris | heh |
18:39 | unless you have some records with some in | |
18:39 | yeah | |
18:39 | kados | maybe google china? |
18:40 | thd | kados: you can easily find out what the record has; it is requesting a preferred encoding in advance from multiple options about which I am wondering. |
18:40 | kados: you can obtain fine UTF-8 records readily over Z39.50 from Russia. | |
18:43 | chris | hmm thats promising |
18:43 | it doesnt munge it | |
18:44 | when i search on something it shows me the right characters | |
18:44 | cant paste the characters in here tho :) | |
18:44 | kados | I can't seem to get it to copy/paste the cars |
18:45 | chars | |
18:45 | correctly anyway | |
18:45 | thd | kados: the Russian State Library Z39.50 server is at aleph.rsl.ru:9909/rsl01 |
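A sketch of pulling a record from the target thd names, using the Perl ZOOM bindings (the Net::Z3950::ZOOM distribution); the query itself is illustrative.

```perl
# Sketch: fetch a UTF-8 record from the Russian State Library Z39.50 target
# thd mentions (aleph.rsl.ru:9909/rsl01); the query is illustrative.
use strict;
use warnings;
use ZOOM;                          # from the Net::Z3950::ZOOM distribution

my $conn = ZOOM::Connection->new('aleph.rsl.ru:9909/rsl01');
$conn->option(preferredRecordSyntax => 'usmarc');
my $rs = $conn->search_pq('@attr 1=4 "history"');   # title keyword search
printf "%d hits\n", $rs->size();
print $rs->record(0)->render() if $rs->size();
$conn->destroy();
```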
18:47 | chris | http://opac.liblime.com/cgi-bi[…]etail.pl?bib=8264 |
18:47 | kados | chris: so some of the chars won't paste, others will |
18:47 | chris | i just edited this record in koha.liblime.com |
18:47 | it stuck | |
18:47 | kados | nice |
18:47 | chris | and now you can search on those characters |
18:47 | and find that record | |
18:47 | it seems to just work (tm) | |
18:47 | kados | sweet |
18:48 | I need to find a record that's utf-8, and has 880 setup | |
18:48 | chris | right |
18:49 | kados | here we go: |
18:49 | http://ihome.ust.hk/~lblkt/xml/marc3.xml | |
18:49 | bad stylesheet, but view source | |
18:49 | chris | right |
18:49 | kados | no 880 though |
18:49 | but I"ve got the pinyin for that in another record | |
18:49 | chris | yeah they just did what id do |
18:50 | kados | ok ... I'm gonna do a test |
18:50 | chris | catalogue in chinese |
18:50 | k | |
18:53 | ahh that reply from carol makes more sense | |
18:53 | thd | kados: at the Russian State library you can find records in many different languages, but all in UTF-8. |
18:55 | kados | thd: right, but my russian is much worse than my chinese :-) |
18:56 | thd | kados: you can search in any language including English and find records. |
18:57 | kados: I think it may be the second largest library in the world. | |
18:57 | chris | can u search it from the web thd? |
18:58 | thd | chris: I expect so but I have only used it as a Z39.50 target |
18:59 | chris | ahh ex libris |
19:00 | you can ask me questions about it, but i cant tell you :-) | |
19:02 | thd | chris: you signed a non-disclosure agreement with Ex-Libris? |
19:02 | chris: http://aleph.rsl.ru/ | |
19:03 | chris | yep i did thd |
19:04 | when i did some work for a consortia in colorado | |
19:04 | i kinda got blindsided | |
19:04 | severely jetlagged, not allowed in the building before i signed it | |
19:04 | etc | |
19:05 | thd | chris: can they sue you for improving Koha to match their features? :) |
19:05 | chris | ahh we surpassed them long ago :-) |
19:05 | there are things in koha i dont work on, just so there is no danger | |
19:07 | kados | thd: do you know if 880's subfields are repeatable? |
19:07 | thd: should they all be set up as repeatable fields? | |
19:07 | thd | chris: I have opted to not apply for work at some places that were liable to claim ownership of my own mind for years. |
19:08 | kados yes repeatable | |
19:08 | chris | its a good rule thd |
19:08 | thd | sorry kados misread your question |
19:09 | kados | thd: it seems like 880 is a special case tag |
19:09 | thd: it relies on information from $6 | |
19:09 | thd: to decide how it will 'act' | |
19:09 | chris | ok its a lovely saturday morning here, i might go outside before i become a troglodyte |
19:10 | kados | chris: ciao :-) |
19:10 | chris | cya's later |
19:10 | kados | thd: am I correct? |
19:10 | thd | kados: 880 just uses the subfields from the linked field and applies the repeatability from the linked field |
19:11 | have fun in the summer sun chris | |
19:12 | kados: The framework should make all possible subfields available in 880, all subfields repeatable except $6 | |
19:13 | kados | ok |
19:13 | but they can't have meaningful labels | |
19:13 | since they will represent many different associated fields | |
19:14 | thd | kados: just label them 2, 3, 4, a, b c, etc. |
19:16 | kados | thd: incidentally, I hope you're breaking things down into the proper tabs |
19:16 | thd: in your Standard MARC framework | |
19:16 | thd: ie, 0X in tab 0, 1XX in 1, 2XX in 2, etc. | |
19:18 | thd | kados: of course, until subfield reordering exists, valid 880s cannot be created if $2 is needed, because $6 needs to be the first one; there is also the related issue of repeatability. |
19:19 | kados | right ... so for now, we'll have to just not use subfields 1-5 for the editor's sake |
19:19 | thd: but 2.2.6 will have subfields reordering | |
19:19 | thd: as well as repeatability | |
19:19 | thd: in fact, I could probably get that going this weekend | |
19:19 | thd: it's quite trivial | |
19:19 | thd | kados yes I have been reassigning them to numeric tabs based on the first number |
19:19 | kados | excellent! |
19:20 | :-) | |
19:20 | thd: when would subfield 2 be needed in 880? | |
19:21 | thd | kados: All that we would have left to add is MARC-8 support and even Koha 2 can shout its virtues loudly. |
19:24 | kados: If the source of information for the field linked from 80 required identifying that source using $2 | |
19:24 | s/80/880/ | |
19:26 | kados: That would not necessarily be common but $6 is not common until you start working with records for languages outside the easy western European ones. | |
19:31 | kados: actually having a lot of seldom used blank $6 subfields in the record editor will be likely to be seen as undesirable by many. They can always use the frameworks to hide them again but then cannot get them when they need them unless there was something like the pop-up that I suggested to bring up seldom used subfields when needed. | |
19:31 | kados: The same issue also applies to seldom used fields. | |
19:35 | kados: $6 is never repeatable but it should be accessible when the occasion arises for libraries that occasionally need it and always present for libraries that always need it. | |
19:37 | kados: Presently there is no way to support the occasional use of a field or subfield within one framework. | |
19:39 | kados: The record editor could be changed to allow occasionally used fields and subfields to be brought up without cluttering the screen most of the time. Furthermore when editing old records occasionally used fields already populated with values should always appear. | |
19:42 | kados: more parameters to support more options like default record editor subfield grouping and ordering for adding new sequences of subfields; and visibility in the record editor but not in the OPAC could also be added easily. | |
19:53 | kados | thd: I notice that when repeating tags, the subfield order of the repeated tag is sometimes different than the original |
19:54 | thd | kados: what do you mean exactly? |
19:54 | kados | thd: also, if a tag is repeated, but empty, it is still preserved |
19:54 | thd: observe: | |
19:54 | http://opac.liblime.com/cgi-bi[…]tail.pl?bib=23717 | |
19:54 | scroll down to the second 880 | |
19:55 | the first 880 has the correct subfield order: | |
19:55 | 6 | |
19:55 | a | |
19:55 | the second has | |
19:55 | a | |
19:55 | 6 | |
19:55 | c | |
19:55 | very strange | |
19:57 | thd | kados: the second is 942 not 880 |
19:58 | kados: sorry, I think I have the wrong link; I only see one populated 880 | |
19:58 | kados | ? |
19:58 | check again, there are two | |
19:59 | wait ... I deleted it | |
19:59 | but notice that they persist though still blank | |
19:59 | thd | kados: you mean two $6 one in 100 and one in 880 |
20:00 | kados: Do you think that the blank 880 $a have a blank pace in them? | |
20:00 | s/pace/space/ | |
20:04 | kados | they didn't |
20:04 | so Koha's saving them for some reason even if there's no data | |
20:04 | In fact, it's saving a bunch of fields for which there is no data | |
20:04 | just look through all the blank fields in that record | |
20:05 | they shouldn't show up because they don't exist in that record | |
20:05 | I'm gonna have a look at the code | |
20:06 | thd | kados: could it be the difference between a null and an empty string? |
20:07 | kados | yea |
20:07 | could be | |
20:07 | thd | kados: I can see odd behaviour in the subfield structure editor where sometimes NULL has been recorded and sometimes an empty string. |
20:09 | kados: I will simplify what I found to one or the other with a global search and replace. | |
20:33 | kados | chris: you happen to be around? |
20:33 | chris: I suspect this is our prob: | |
20:33 | my @tags = $input->param('tag'); | |
20:33 | my @subfields = $input->param('subfield'); | |
20:33 | my @values = $input->param('field_value'); | |
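A sketch of the kind of guard this suggests, not the actual MARChtml2marc fix: skip parallel entries whose value is undefined or empty so blank repeated tags never reach the saved record ($input and $record are assumed from the surrounding script; add_subfield() is a stand-in for whatever the real code does with each triple).

```perl
# Sketch of the idea only, not the actual MARChtml2marc fix: when walking the
# parallel tag/subfield/field_value arrays, skip entries whose value is blank
# so empty repeated tags and subfields never reach the saved MARC record.
my @tags      = $input->param('tag');
my @subfields = $input->param('subfield');
my @values    = $input->param('field_value');

for my $i (0 .. $#values) {
    next if !defined $values[$i] || $values[$i] eq '';   # drop null/empty values
    add_subfield($record, $tags[$i], $subfields[$i], $values[$i]);
}
```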
20:34 | thd | kados: chris is playing in the sun |
20:34 | kados | yea, but I thought maybe he snuck back in :-) |
20:35 | thd | kados: paul wrote the lines you are looking at now. |
20:36 | kados | yep |
20:36 | I'm in MARChtml2marc at the moment | |
20:36 | i should be able to fix the null vs blank prob we're having | |
21:07 | yay ... solved that | |
06:16 | osmoze | hello |