IRC log for #koha, 2008-06-05

← Previous day | Today | Next day → | Search | Index

All times shown according to UTC.

Time Nick Message
12:27 nengard hey all - sys pref question - shouldn't OPACUserCSS default to '' ?  Right now it's defaulting to '0' ... isn't that printing out a 0 somewhere in the code??
12:39 hey all - sys pref question - shouldn't OPACUserCSS default to '' ?  Right now it's defaulting to '0' ... isn't
12:39 owen If so, nengard, then you're right--that's a bug.
12:41 nengard owen - oops - here's my full question
12:41 hey all - sys pref question - shouldn't OPACUserCSS default to '' ?  Right now it's defaulting to '0' ... isn't that printing out a 0 somewhere in the code??
12:41 i am just a bug magnet
12:41 owen I think no, because the template system will interpret "0" the same as "empty."
12:42 So it's a minor bug
12:42 nengard ah - but it should still be an empty string?
12:42 ok
12:51 thd kados ping
12:54 owen kados is on the road, thd, so I don't know if you'll be able to catch him
12:55 thd owen: which road is that?
12:55 owen All of them, I think. He had a huge itinerary
12:58 thd owen: will bug #218 be on your itinerary in future?
12:58 owen 218?
12:59 thd, do you mean 2189?
13:00 thd owen: #2169 I mean. It was not one that I was thinking of my latest koha-devel list posting but kados had asked me to post that bug almost a year ago.
13:01 owen I would like to see Bug 2169 fixed just as much as you, but it's outside of my expertise.
13:01 Bug 2169 has *always* existed, as far as I know.
13:01 thd s/#2\d+/#2189/ # tubule tying today
13:02 owen ?
13:02 thd another always existed
13:02 2189
13:03 owen thd, is the primary issue of 2189 that of database field sizes?
13:04 thd owen: yes it is not a bug in itself but has been an obstacle to fixing bugs by raising the number of changes required
13:06 owen: although one current bug is directly caused by it perhaps
13:07 owen "The only Koha code which restricts the size of organisation codes to my
13:07 knowledge is the hard coding in the templates."
13:07 The templates are generally set up to match field sizes in the databases
13:07 The database is not configured to match form field sizes
13:10 thd owen: I did not look to see where the problem was I only guessed
13:11 owen If there is a problem with the size of database fields they can certainly be changed, but it's necessary to pick a limit
13:12 thd owen: what then restricts the ability to enter more than a given set of characters in a Koha for field?
13:12 owen It's true that the templates may need to be changed to reflect the database field limits, but it's the database that sets the limit, not the templates
13:13 The templates just have a maxlength attribute applied to the input tag
13:14 thd owen: exactly it is the maxlength value which causes the problem
13:14 owen: the maxlength value is hardcoded when it need not be
13:15 owen The maxlength attribute should reflect the maximum number of characters allowed by the database field
13:15 Unless you're finding that the database field allows enough characters but the form field maxlength is smaller
13:17 thd owen: are you saying that maxlenghth is currently read from the database limit for that field in real time?
13:17 owen No
13:18 thd owen: then it is hard coded as I had presumed?
13:18 owen Yes
13:19 thd owen: my suggestion is to calculate it in real time from the database maximum for the particular field
13:19 hdl thd: hi
13:19 nice to see you
13:19 atz thd: that does not sound very reasonable
13:19 owen Why? Because you suspect libraries will be changing the size of their database fields?
13:20 thd hello hdl
13:20 hdl this would really be MUCH proc-intensive.
13:20 thd owen: because it is self validating and allows changes when necessary
13:20 atz the cataloging module in particular would drag incredibly
13:21 it is perfectly valid for a DB field to have capacity greater than the interface makes use of
13:22 thd hdl: well if excessive CPU would be required then we need your idea of auto-building templates from a configuration file
13:23 not that that would happen any time soon
13:23 non-real-time auto-building
13:24 build them whenever the database changes
13:24 atz what fields, specifically, do we think any library would be interested to enlarge?
13:26 thd atz: I mention 2 in bug #2189 as mere examples.  Any where enlarging would suit a local need.
13:26 owen: #2095 may be a currently problematic instance.
13:28 owen: for #2095 maxlength should be no smaller than the maximum size of a MARC record and should not need to be even that small for MARCXML.
13:29 owen thd, your comments are much better addressed to atz and hdl, since you're not talking about a template issue
13:29 thd owen: is 2095 not a template issue?
13:29 atz thd: as a practical matter, I think it would be far easier to address specific problems than to convert wholesale to adding an extra variable for every input field
13:29 owen thd: no, not as far as I know
13:30 thd owen: OK
13:30 atz thd: as I said earlier, it is perfectly valid for the interface to specify a lower limit than the DB field's capacity
13:30 nengard just want to confirm that this sys pref 'RoutingSerials' is talking about serials routing lists?
13:31 hdl nengard: yes
13:31 nengard thanks hdl
13:31 thd atz: the intention was to make Koha easier to customise but why should the interface limit valid data?
13:32 atz because the *limit* could be valid.
13:32 not to mention it keeps your server from getting overrun
13:33 how is this problem critical?
13:33 thd atz: the first example I describe in bug #2189 is a case where any smaller limit would not allow standards conforming data.
13:33 atz the examples cited are either problems that are already fixed, or ones that *might* occur later
13:35 thd atz: yes, as I told owen a few minutes ago it is not a bug in itself but has been an obstacle to fixing bugs by raising the number of changes required
13:35 atz of course revising the structure that broadly would require an enormous number of changes
13:38 for specific fields that we consider more volatile, we might be able to take this approach relatively soon
13:38 thd atz: yes, I presumed that it was easier to make that change once but if the method is too CPU intensive or would require rewriting how the templates used are built for a batch process then I will withdraw the bug to an enhancement for the distant future
13:38 atz: which approach?
13:39 atz the dynamic approach
13:39 thd atz: I see for a small number deemed liable to change
13:39 atz i'm saying we might be able to use this selectively, rather than with every input field
13:39 right
13:42 thd atz: I guess my thinking was only concerned with values which should not be so small that they could not be used for giving a semantically meaningful name in coded form.
13:45 atz: I think that if the few cases of which I am thinking were hard coded at a maximum of 32 characters in the database and the templates you would never see any problem for decades.
13:47 atz thd: that sounds more feasible
13:47 thd atz hdl: if the templates are not the cause of bug #2095 then what would be?
13:47 atz thd: or in the case of framework code and even organization code, perhaps all that is needed is an additional "description" field to allow the longer te
13:47 *text
13:49 thd: i don't know about #2095.  that seems odd.  can you give a URL?
13:50 or nengard, perhaps you can elaborate?
13:50 nengard atz - off to read it
13:51 atz 134 is a weird number indeed
13:51 hdl thd : Some notes fields in UNIMARC are Textarea notes and not input fields.
13:51 nengard what's the question?  when I entered in the 520 field it stopped me from typing after 134 characters - i was in the cataloging module when doing this
13:51 hdl Would that be enough for you ?
13:52 thd atz: there is one description field already for the frameworks but it shows what the librarian wants to see and not what they hope not to no about what the framework.  A framework memonic is so the person editing the frameworks can be reminded what he is really editing.
13:52 atz nengard: can you give a URL so I know what page to look at?
13:52 nengard atz koha/cataloguing/addbiblio.pl?frameworkcode=BKS
13:53 thd nengard: I did not see this bug myself.  Which Koha editor does it affect?
13:53 nengard thd - maybe it cut it off after i submitted it - or maybe it has been fixed
13:53 it has been a while - let me go try to edit a record ....
13:54 thd nengard: no I mean that I did not test it
13:54 nengard oh
13:55 thd nengard: I see by the URL that you mean the guided forms based record editor not the newer biblios record editor.
13:56 hdl atz : it is quite puzzling for me that C4::Context could use C4::Context->preference in its BEGIN block.
13:56 Is it allowed ?*
13:56 owen There are several instances of 'maxlength="255"' in addbiblio.pl, which I assume correspond to database field limits
13:56 nengard thd hdl atz - i just tested in the newest version i have and it is not limited
13:57 i got over 400 characters into the field when editing
13:57 atz hdl: it is an unusual exceptional case
13:57 nengard here's the edit URL: koha/cataloguing/addbiblio.pl​?biblionumber=263&op=
13:57 atz see what is happening is that the sub "handle_errors" is being defined
13:57 thd owen atz: most subfields have no limits defined in the MARC standards and should not be limited in any significant way.
13:57 nengard thd++
13:58 atz that *sub* uses ->preference, but the sub itself will not be called until after the BEGIN block.
13:59 thd owen: what record editor fields are limited to 255 for records?  That is too big for any fixed length fields and too small for anything else by the standard at least.
13:59 atz hdl: does that make sense re: Context?
13:59 hdl atz : asking it because on SUN it threw me out.
14:00 atz hdl: interesting
14:00 what version of perl?
14:00 hdl 5.8.5
14:00 mc hello all
14:01 thd hello mc
14:01 hdl thd : mc is a BibLibre man.
14:01 owen thd: view source on the Add MARC page to see
14:02 hdl mc: thd is american and worked on MARC21 frameworks. he is really helpful in general use cases.
14:02 mc thx, hdl
14:03 thd mc: do you live in France?
14:03 mc thd, yep
14:03 strasbourg
14:03 near germany
14:03 (right at the frontier, in fact)
14:04 hdl: peraps it's time for me to read the perldoc again but BEGIN block is executed asap
14:04 when bytecodinf
14:04 bytecoding
14:05 so you can't use C4::Context in the C4::Context's BEGIN
14:05 hdl atz ?
14:06 atz sorry, on conf. call..... bbl
14:07 mc hdl: i just verified with a simple code
14:07 and it is!
14:08 hdl Can you detail ?
14:08 it is what ?
14:12 mc hdl, are you talking about this line :
14:12 30                                 my $debug_level =  C4::Context->preference("DebugLevel");
14:14 hdl yes
14:17 thd nengard: how did you fit 400 characters into a field defined as maxlength 255?
14:17 nengard LOL
14:17 no idea :)
14:17 but it worked
14:17 .....
14:18 504 -BIBLIOGRAPHY, ETC. NOTE
14:18  a Bibliography, etc Includes bibliographical references and indexes.
14:18 520 -SUMMARY, ETC.
14:18  a Summary, etc Learning to Teach in an Age of Accountability, by Arthur Costigan and Margaret Smith Crocco, with Karen Kepler Zumwalt, documents the voices of many new teachers -- important voices, articulate voices, emotional voices. In an age of accountability, these voices bring to light many significant struggles, tensions, and conundrums that exist for them as they enter middle and secondary school environments. ?Teaching Educ
14:18 ation
14:18 this  - after i hit submit
14:19 owen If it's a textarea, then there isn't a maxlength
14:19 maxlength only applies to input type="text"
14:19 paul nengard: in MARC editor, there is an automatic mechanism that transform a input into a textarea when subfield size is >200
14:19 thd owen: I see maxlength 255 everywhere
14:19 paul + all notes fields are automatically set to textarea
14:20 thd paul: I see 255 for some notes fields at least
14:20 nengard paul - i did notice that when adding a new record the field was a text field and when editing it was textarea ...
14:21 is there anyway to make it always a textarea?
14:31 paul nengard: what do you mean by "it" ?
14:32 nengard sorry - 520 $a on addbiblio.pl
14:32 oh - i was editing a biblio
14:32 paul - let me start over
14:32 owen You could even auto-resize your textareas: http://zivotdesign.com/example[…]ery/textarea.html
14:33 nengard when viewing the 520a field when adding a new record on addbiblio it was a text field when editing a biblio it was a textarea
14:35 hdl owen : COOOL ;)
14:36 owen jquery++ :)
14:36 thd owen: those 255 limits to the extent I have seen them would not be enough on some rare occasions like lengthy statements of responsibility or the longest narrative titles given to some 18th century books.
14:37 paul one day, /me will find some time to investigate jquery a lot...
14:38 thd owen: since the cataloguer should know enough not to abuse the system 255 should be something like text but not text area.
14:39 s/know/be trusted/
14:42 owen: no arbitrary limit like 255 should ever allow any possible real world record to have truncated subfields even if no one writes long 18th century book titles when writing books today.
14:47 nengard thd - i've seen those long titles today :) hehe
14:48 thd owen: adding a 0 to 255 would be enough but in case some note subfield was accidentally missed 99999 is the maximum size of a MARC record and certainly safe where 255 would not be.  We should never have a risk of data loss merely by saving a record into the editor.
14:51 nengard: I am waiting for the eighteenth century enlightenment to come back into full fashion so we can get out of the neo dark ages.
14:52 nengard heh
14:57 owen I'm waiting for all new nonfiction titles to stop following the same pattern: "BIG WORD!: how blah blah blah and more blah blah and even more and more buy my book"
14:58 gmcharlt owen: it's a requirement of PhD to sign a blood oath to entitle all monographs that way
14:59 hdl lol
15:03 nengard ehe
15:04 thd The rule for determining which subfields should be notes subfields and therefore textarea is much too simple.  A frameworks value would be required to do it correctly.  Only the most important ones have been caught.
15:05 owen: I propose that 255 be changed to 99999 to avoid a data loss risk.
15:05 ... for addbiblio.pl .
15:07 owen Since no one would want to edit 99999 characters from a single-line text input field, the correct solution would be to convert all to textareas
15:08 thd owen: 99999 is merely the theoretical maximum for the standard :)
15:09 owen But my point stands: I wouldn't even want to have to edit 255 characters from a single line input field
15:10 thd owen: I guess you are right that if you were cataloguing those very long book titles you would want it to become textarea so you could see what you would be doing.
15:11 owen Which is why it might be good to convert all inputs to single-row textareas and make them automatically resizeable
15:11 thd owen: as long as those are only textarea when the transition point has been passed then making everything textarea should be fine.
15:13 atz lol gmcharlt : you know, sometimes I want to escape from the koha list, too.
15:13 gmcharlt atz: heh - at least we're paid, in effect, to be on it
15:13 ;)
15:14 thd being unpaid: I sadly no longer read the koha list
15:15 owen Generally the koha list can be considered the koha-install list
15:16 paul hi gmcharlt. Pls note my comment on bug 2173
15:16 gmcharlt paul: I've seen the patch - looks OK, but I want to test - hopefully later today
15:17 paul you've seen the patch ?
15:17 but hdl send it to me only. hdl, you send it to gmcharlt as well ?
15:17 thd For a period in 2005 tried to give a fulsome answer to every koha list question without being paid but that could not last.
15:18 gmcharlt paul: I thought it was the "restoring startsby patch on authorities" - is that the one?
15:18 paul nope.
15:18 it's the "heading-main" search one (search on heading $a)
15:18 gmcharlt thd: flattering Koha users? nice ;)
15:24 ryan gmcharlt: i just noticed yesterday that the '$a only' search does in fact include the entire heading.
15:25 i agree it would be useful for that to work as expected.
15:25 hdl ryan : to reenable it, a new index should be created.
15:26 I reenabled it on UNIMARC.
15:26 thd gmcharlt: how would 2173 be fixed for MARC 21 without searching against special local use field where names were last names would be separated like they are in UNUMARC?
15:26 hdl But would not make it on USMARC since I donot master those DOM files enough.
15:27 thd s/names were//
15:30 gmcharlt thd: simplest would be to make it have a different meaning - search for base heading and ignore subdivisions - although that would not be much different from a left-anchored heading search
15:31 thd: parsing first and last names is not readily supportable in MARC21, of course
15:31 thd gmcharlt: I see yes, left anchored would work for MARC 21.
15:32 gmcharlt: The comma is there for MARC 21 so last names could be parsed into a local use subfield for special treatment if needed.
15:33 gmcharlt thd: comma is only a half measure, alas - far too many names, royal titles, and so on, would be edge cases
15:33 thd: on other hand, I'm not sure that a first name search is necessarily needed, at least for staff and cataloging purposes
15:34 thd gmcharlt: Titles etc. already have their own subfields in both MARC 21 and UNIMARC.
15:35 gmcharlt thd: 100  0#$aGustaf$bV,$cKing of Sweden,$d1858-1950 - where's the comma to identify the first name, e.g.,
15:36 thd gmcharlt: Is the same code used for finding the authorised form to use in the OPAC when invoking the [...].
15:36 ryan i was referring, btw, only to the fact that if I search on King in the field marked '$a only' , i'll get gmcharlt's example there in my results.
15:37 i don't see a need to add further indexes on subsets of $a.
15:37 thd gmcharlt: in that case by AACR2 the first name is the most important and the last name is seldom known by searchers.
15:39 ryan: If that was the issue then left anchoring on the whole field for names should solve the problem
15:41 ryan Ah, I see.
15:42 The template then needs to be reworded if that's the solution.
15:43 hdl for UNIMARC it is not a solution thogh.
15:43 Our users wants to be able to search on Main Entry.
15:44 But I already posted a patch for that on our sied.
15:44 I can send it to you if you want to have a look at it.
15:45 gmcharlt thd, ryan, hdl: left-anchored is enough for MARC21, so can make searchboxes different for UNIMARC and MARC21
15:45 ryan yes, I follow now.  I haven't looked at the dom indexing definition yet.
15:45 gmcharlt hdl: please send me the patch if you haven't already
15:45 paul gmcharlt: I did it 10mn ago ;-)
15:45 gmcharlt paul: ok, it's *that* one ;)
15:45 ryan gmcharlt: i'm fine with that approach.  as long as the interface does what it says i'm happy :)
15:45 paul (although it's a fw: maybe a direct mail would be easier to deal with)
15:46 gmcharlt paul: no, it's fine, git-am is smart enough to deal with it
15:47 hdl ryan : left anchored search would be search starts by
15:47 I sent a patch for it.
15:53 paul tinaburger: are you really here, or is it an auto-connect ?
15:53 acmoore be not afraid, it is her!
15:58 hdl hope you are getting better.
17:40 thd gmcharlt: what is the last message that you have from the koha-devel list?
17:42 atz: are you here?
17:53 danny hello #koha
17:58 atz i'm here... had to reset for OS update
18:02 thd atz: what is the last message which you have from the koha-devel list?
18:03 atz Thomas' reply to Koha 3.0 Stable Release Plan
18:04 thd atz: why do I not have it.  I checked my subscription on Savannah and it seems fine.
18:05 atz you may have to ask your email provider.
18:06 thd atz: is a LibLime or Katipo mail system involved for koha-devel?
18:09 atz: I triggered the alert for the koha users list because I do not subscribe to Koha users with the same address and hope I have not banned myself from receiving on the koha-devel list in consequence.
18:09 The Koha users list was in the CC line from Joshua originally.
18:09 atz thd:  lists.koha.org => 212.85.152.90
18:10 nslookup 212.85.152.90
18:10 Server:         72.36.190.2
18:10 Address:        72.36.190.2#53
18:10 Non-authoritative answer:
18:10 90.152.85.212.in-addr.arpa      canonical name = 90.152.85.212.rev-sql.lost-oasis.net.
18:10 90.152.85.212.rev-sql.lost-oasis.net    name = paulpoulain.com.
18:10 so I'm guessing paul is the guy to ask on that
18:11 thd atz: OK so if I have nothing more by Tomorrow morning I will ask paul.
18:24 hdl thd you can ask me.
18:25 thd : I recieved you mails.
18:48 thd hdl: The problem is that I did not receive my usual copy of y own message for the last two in reply to paul and slef
18:48 s/ y/ my/
18:50 hdl: mailman had caused problems for me about two years ago
18:51 hdl: two years ago mailman said that I was subscribed but had stopped sending me messages.  I had to have kados fix my account at the time.
19:16 dkg i'm trying to get koha running against perl 5.10 with CGI::Session 4.30 (debian lenny)
19:16 The installer is giving me this message:
19:16 Can't locate object method "generate_id" via package "CGI::Session::ID::" (perhaps you forgot to load "CGI::Session::ID::"?)
19:16 (i'm using koha via git, up-to-date as of this morning)
19:17 looking through the code, the only invocations of CGI::Session() use serializer:yaml.
19:18 I can coax a test perl file to fail with the same message just by creating a CGI::Session object with serializer:yaml, also, though it doesn't fail if i omit mention of a serializer.
19:18 any thoughts about what next steps to take for the debugging?
19:19 gmcharlt dkg: sounds similar to issue reported for OpenSUSE
19:19 dkg: try add Perl modules CGI::Simple and FreezeThaw
19:19 dkg: also, how did you install CGI::Session - was it from CPAN or from libcgi-session-perl
19:21 dkg libcgi-session-perl
19:21 i'll try those modules, though.
19:23 hrm.  i added those modules, but i'm still getting the same error with my test program.
19:23 and the same error with the koha installer, too.
19:24 atz dkg: now go back and make CGI::Session again, iirc
19:25 dkg atz: i installed via aptitude.  should i re-install?
19:26 atz hmm... not sure there
19:27 dkg i did "aptitude reinstall libcgi-session-perl", and my test program is still failing, so that's not it.
19:27 gmcharlt hmm - etch has 4.15 of CGI::Session, lenny has 4.30 - perhaps 4.30 has an incompatible API change
19:29 dkg if i modify my test program to omit the ;serializer:yaml, it seems to work fine.
19:29 hdl dkg : I coped with that on SUN.
19:29 gmcharlt dkg: instead of that, try adding ";id:md5" after the serializer:yaml bit
19:29 per patch hdl filed today
19:29 hdl This can be fixed with id:md5
19:30 Consider adding use CGI::Session::ID::md5 to CGI::Session if it is not enough.
19:30 atz yeah, i saw that patch.  seems interesting.
19:31 hdl atz : Indeed, on SUN it was vital.
19:31 dkg if i append that to my test program, i get this error:
19:31 Can't locate object method "generate_id" via package "CGI::Session::ID::md5" (perhaps you forgot to load "CGI::Session::ID::md5"?) at /usr/share/perl5/CGI/Session.pm line 74.
19:32 but if i add "use CGI::Session::ID::md5;" to the top of the file, it completes
19:32 (with a later error message of:
19:33 (in cleanup) Can't locate object method "freeze" via package "CGI::Session::Serialize::yaml" at /usr/share/perl5/CGI/Session.pm line 241
19:33 )
19:38 hdl: do you think that this represents a bug in CGI::Session?  Should we report it to the module authors?
19:38 gmcharlt dkg: if you then add a "use CGI::Session::Serialize::freezethaw" does it help
19:38 dkg: I think it is either an API change, a bug possibly related to Perl 5.10, or perhaps a Debian packaging glitch
19:39 hdl dkg : First, I was surpirsed that CGI::Session on CPAN was an unauthorised version.
19:39 surprised even
19:40 Second, I surely consider those things as bugs.
19:40 dkg gmcharlt: adding "use CGI::Session::Serialize::freezethaw;" did not get rid of the "(in cleanup) Can't locate object..." error message in my test script.
19:40 hdl But I donot know what is the reason for those bugs.
19:40 gmcharlt dkg or hdl: could you file a bug at bugs.koha.org for this
19:41 hdl Look at CGI::Session::Serialize::yaml
19:41 gmcharlt it looks like it will need some time to investigate
19:41 hdl And find if there is freeze method in this Module.
19:42 will do gm
19:44 done.
19:46 dkg http://bugs.koha.org/cgi-bin/b[…]w_bug.cgi?id=2216
19:46 thanks, hdl.
20:07 I just added a patch against the git head to bug 2216 that works for me to resolve the base issue (i still don't know about the "(in cleanup)" error message)
20:08 hdl dkg : i think that that patch had just been pushed today or will soon.
20:11 dkg hdl: excellent.
20:18 OK, so while the upgrade now works, no user session lasts more than one pageview.  I'm assuming this is because the CGI::Session object isn't properly serializing data because of the "Can't locate object..." errors.
20:18 where do i find CGI::Session::Serialize::yaml ?
20:18 it's not in /usr/share/perl5/CGI/Session/Serialize, which is where i'd usually expect it :/
20:20 gmcharlt it would be /usr/share/perl5/CGI/Session/Serialize/yaml.pm - available in 4.14 (etch package) but not 4.30 (lenny package)
20:22 dkg: ok, from CPAN changelog for CGI::Session, CGI::Session::Serialize::yaml was split out as separate module
20:22 so you'll need to get it from CPAN
20:23 atz gmcharlt: so we'll have to add that as an explicit, separate dependency
20:23 gmcharlt atz: yes, and ask Vincent to package it
21:41 dkg i've filed an RFP for the package in debian, fwiw:
21:41 http://bugs.debian.org/484556
21:42 i don't know who Vincent is, but if he plans on packaging it, he can close that bug in the first changelog.
21:45 gmcharlt dkg++
21:48 dkg (using a package built with dh-make-perl of CGI::Session::Serialize::yaml does appear to fix the problem i was having).
21:48 one more annoying question for the day:
21:49 what's the best way to do an import of 10000 marc records with roughly the same number of items?
21:49 can i do this server-side, or do i need to do it through the web ui?
21:49 gmcharlt dkg: server-side, you can use misc/migration_tools/bulkmarcimport.pl (to load in one go)
21:50 or misc/stage_biblios_file.pl and misc/commit_biblios_file.pl if you're adding records to a database that already has some and you need to handle duplicate bibs
21:50 dkg ok, i think that makes sense.
21:50 gmcharlt if you're using Zebra as the indexing engine, you'll then need to follow up with a rebuild_zebra.pl -b -z
21:51 dkg is there a way to reject staged records, or do they hang around forever until you commit them?
21:51 gmcharlt the staged records stick around
21:51 dkg (for instance, if you realize something went wrong after staging but before commit)
21:51 gmcharlt a delete button will be added eventually in the web UI
21:51 but you can also do it from using SQL: delete from import_batches where import_batch_id = N
21:52 dkg ah, cool.  that's great.  and that'll have no nasty side effects down the line?
21:52 gmcharlt no side effects to deleting a staged batch pre-commit
21:52 dkg thank you.
21:52 gmcharlt post-commit, main effect is that you can't run the tool to undo the import
21:53 dkg I see.
21:53 gmcharlt but if you're doing an initial import, that's OK - bulkmarcimport.pl -d deletes all of the bibs and items if you need to start from scratch
21:54 dkg OK, you all were a huge help for me today.  Thank you very much.
21:54 gmcharlt you're welcome
21:54 dkg if you're ever in NYC, i'll get you a beer (or other drink if you don't drink beer)!
21:55 gmcharlt heh - thanks!
22:37 hdl_laptop owen around ?
22:48 owen : when you read this, I was just wondering how I could get a function in Jquery used both on load and on change.
23:18 atz hdl_laptop:   $(document).ready(function() { ... your code here ...; });  
23:18 then
23:21 $(document).ready(function() {
23:21    $('.ajax_buttons' ).css({visibility:"visible"});
23:21    $('body').click(function(event) {
23:21           if ($(event.target).is('.ok')) {
23:21                   $.ajax({
23:21                        "data": {ok: $(event.target).attr("title"), CGISESSID: readCookie('CGISESSID')},
23:21                         "success": count_approve // success_approve
23:21                    });
23:21                    return false;   // cancel submit
23:21           }
23:21 }
23:21 so that would set up an action when the page is ready (unhiding js-dependent buttons)
23:22 and also establish what would happen when a click happens on a target that has class="ok"
23:22 this example is simplified from koha-tmpl/intranet-tmpl/prog​/en/modules/tags/review.tmpl
23:23 hdl_laptop: does that answer your question?
23:29 hdl_laptop partially.
23:29 atz: I would have liked to be able to add
23:31 $(frequency).change(function(){myfunction;}
23:31 $(frequency).load(function(){myfunction;}
23:32 add function to both change and load.
23:32 If there was.
23:33 seems that what you suggests suits your use. But mine is quite different.
23:34 I have 2 selection fields : frequency and number patterns.
23:35 And I would like to load values on load, if there are some.
23:35 But maybe it is just daydreaming.
23:35 (it is 1.30 here...)
23:50 atz when you say "load" you mean when the page is first rendered
23:52 in jQuery, $(document).ready means when the DOM has been built and is ready to be manipulated
23:52 if you have id="frequency" on your input element, you could use:
23:53 $("#frequency")....
23:56 change docs: http://docs.jquery.com/Events/change
01:41 owen Hi cnighs
01:43 cnighs hi owen
02:09 masonj_ heya chris ;)

← Previous day | Today | Next day → | Search | Index

koha1