← Previous day | Today | Next day → | Search | Index
All times shown according to UTC.
Time | Nick | Message |
---|---|---|
00:33 | dcook joined #koha | |
00:41 | ex-parrot joined #koha | |
00:41 | eythian | ex-parrot: welkommen |
00:41 | ex-parrot | hey eythian |
00:41 | wahanui | go back to bed, eythian |
00:41 | ex-parrot | I just came to harass cdickinson |
00:42 | eythian | seems reasonable |
00:42 | ex-parrot | (koha is cool too) |
00:42 | dcook | ^ |
00:51 | Francesca | I don't think cdickinson is around |
00:52 | ex-parrot | I managed to get ahold of him :) |
00:52 | cdickinson | I just got back, Francesca |
00:52 | Francesca | sup |
00:53 | Francesca_ joined #koha | |
00:54 | Francesca_ | ugh mac just crashed |
01:01 | jamesb joined #koha | |
01:11 | liz joined #koha | |
01:33 | eythian | what's the table that stores what permissions borrowers have? |
01:35 | oh | |
01:35 | dcook | hehe |
01:35 | eythian | maybe user_permissions |
01:35 | or maybe not | |
01:36 | it's hard to tell | |
01:36 | dcook | Well, flags in the borrower table, and then I think a mix of.. |
01:36 | eythian | ah |
01:36 | dcook | `permissions` I think is subpermissions? |
01:36 | eythian | I bet it's the flags one I want |
01:36 | dcook | I would think so |
01:36 | And then you compare that against... | |
01:36 | `userflags` | |
01:37 | Right, `userflags` has the permissions... `permissions` has the subpermissions... | |
01:37 | In the best naming scheme ever | |
01:37 | eythian | ikr |
01:37 | dcook | And I think `user_permissions` might store something about subpermissions? |
01:38 | eythian | all I need to see is the users that have any permissions, so it's easy enough. |
01:38 | huh, it's weird seeing the name of someone I used to work with pop up in there. | |
01:38 | oh, she used to work there too | |
01:38 | I'd forgotten about that | |
01:39 | ("there" being the client whose database I'm poking in) | |
01:39 | http://beatonna.tumblr.com/pos[…]s-of-li%C3%A9bana <-- unrelated | |
01:39 | dcook | And people think there isn't any interesting data in library databases |
01:39 | The potential for data misuse seems fairly high to me... | |
01:39 | hehe | |
01:39 | I like that image | |
01:40 | eythian | that's why we don't let people get to the database who aren't supposed to :) |
01:42 | oddly, I'm totally not finding the user I'm expecting to | |
01:43 | maybe it's vanished | |
01:43 | dcook | deletedborrowers? |
01:43 | eythian | nah, I expected it on staging, it's probably been zapped by a database refresh |
02:18 | indradg joined #koha | |
03:08 | dcook | eythian: I was just thinking... in theory Zebra could handle other formats than just MARC, yeah? |
03:08 | The problem would be with Koha | |
03:08 | eythian | well |
03:08 | dcook | That it would try to get MARC out and that wouldn't work for non-MARC |
03:08 | eythian | Probably |
03:08 | but, I don't know if you can mix them up | |
03:09 | however, it's way outside my scope of knowledge of zebra. | |
03:09 | dcook | hehe |
03:09 | Fair enough | |
03:09 | Yeah, it's outside mine as well | |
03:10 | Thinking about all the different challenges we face with supporting different metadata formats... | |
03:11 | Zebra is a free, fast, friendly information management system. It can index records in XML/SGML, MARC, | |
03:11 | e-mail archives and many other formats, and quickly find them using a combination of boolean searching | |
03:11 | and relevance ranking. | |
03:11 | wahanui | relevance ranking is broken by QueryAutoTruncate |
03:11 | dcook | Shh |
03:11 | * dcook | wonders if he was the one who wrote that.. |
03:12 | dcook | But even if you stored multiple things in Zebra... you'd need some sort of way to get them out.. |
03:12 | eythian | yeah |
03:12 | Koha really expects marc formats | |
03:12 | dcook | Yeah |
03:12 | I wonder how Zebra returns things... | |
03:13 | Does it store the original and its own internal record? | |
03:13 | It certainly stores the latter.. | |
03:13 | If only windows would let me open things.. | |
03:13 | eythian | heh windows |
03:13 | I have no idea how zebra stores things | |
03:13 | dcook | Kill windows with fire.. |
03:13 | ex-parrot left #koha | |
03:14 | eythian | not ex-parrot, windows |
03:14 | zebra is mostly a mysterious black box to me | |
03:14 | dcook | Fair enough |
03:17 | Come on, Zebra, tell me your secrets... | |
03:19 | kathryn joined #koha | |
03:19 | dcook | This install looks funny.. |
03:28 | Ahhhh | |
03:28 | "A parser for binary MARC records based on the ISO2709 library standard is provided, it transforms these | |
03:28 | to the internal MARCXML DOM representation" | |
03:28 | Actually, save the ahhh moment for later.. | |
03:32 | I think I understand everything except how it outputs usmarc.. | |
03:32 | iso2709 rather.. | |
03:38 | It doesn't store iso2709 internally... not even in GRS1 | |
03:38 | GRS1 used its own internal format.. | |
03:40 | But we only care about DOM now.. | |
04:01 | So with DOM... it'll parse iso2709 into MARCXML... and it just stores MARCXML as MARCXML in the Zebra storage... | |
04:01 | Which is why the identity.xsl will return MARCXML for "elements marc" and "elements marcxml" | |
04:01 | Neato burrito | |
04:03 | In theory you could index whatever into Zebra and get out MARCXML... so long as you could specify a XSLT to make it so | |
04:04 | Although the smarter thing to do would probably be to define some sort of intermediary format... | |
04:04 | Which would require re-doing all the XSLTs of course | |
04:05 | Or would it.. | |
04:06 | You could actually expand the existing detail and result XSLTs... | |
04:06 | Or refactor them to make them better... | |
04:06 | We could have templates for discrete parts of the detail page | |
04:07 | Title template, author, the rest, etc. | |
04:07 | Well... that could be tricky.. | |
04:07 | Might get confusing for developers | |
04:08 | eythian | that's sooorrrrttttaaa what I'm doing with ES |
04:08 | though, it's not really used | |
04:08 | it could be though | |
04:08 | dcook | How do you mean? |
04:09 | eythian | all the fields are converted (mappings) to things like "title" and so on |
04:09 | it's more aimed at searching though | |
04:09 | I don't think putting display elements into the index is a good idea | |
04:09 | I think putting the record in and converting it to display after fetching is better | |
04:09 | dcook | Agreed |
04:10 | eythian | after all, you might want different displays in different situations |
04:10 | dcook | The XSLT for the detail page could check if it's MARCXML or whatever else |
04:10 | Yep | |
04:10 | eythian | I think you would need an XSLT for each format you support |
04:10 | it's the only sane way | |
04:10 | dcook | It would be the easier way :p |
04:10 | eythian | insofar as XSLT is sane |
04:10 | dcook | True that |
04:10 | You could do multiple ones.. | |
04:11 | And have one top level one | |
04:11 | Well...what do I mean.. | |
04:11 | Really one XSLT | |
04:11 | eythian | I suppose you could |
04:11 | dcook | You'd import the other ones in just to get access to their templates |
04:11 | The idea would be to make the search results and detail pages modular | |
04:12 | So the XSLT needs to create certain blocks of HTML | |
04:12 | Title, author, details, etc | |
04:12 | And the way that each metadata format does that is up to its particular template | |
04:12 | Which the top-level template will call depending on the metadata format | |
04:12 | Easy peasy | |
04:12 | eythian | yeah true |
04:12 | good thinking | |
04:12 | dcook | Thanks |
04:13 | eythian | you could even have the XSLT extract parts and put them into XML, then that's rendered using a display XSLT |
04:13 | so you'll have a conversion XSLT for each format, but only one display one for each display situation. | |
04:13 | e.g. results vs. detail vs. staff client | |
04:14 | Alien vs Predator vs Brown vs The Board of Education | |
04:14 | dcook | I'm not sure I entirely follow but I think I agree |
04:14 | Ah, yes, I getcha | |
04:14 | That's another way of doing it | |
04:14 | I think it would be more work in the short-term, but much smarter in the long term | |
04:15 | eythian | little modular lego bricks :) |
04:15 | dcook | And you can specify that extraction XSLT in the Zebra config |
04:15 | Have an intermediate format that you can use for the display XSLT | |
04:15 | eythian | can you do it per record? |
04:15 | dcook | per record type |
04:15 | So yes | |
04:15 | eythian | ah OK |
04:15 | dcook | So I think search could be achievable |
04:16 | And in the short-term... I'm thinking that search is the most important thing | |
04:16 | Because that leads you to external resources or to items | |
04:16 | Items... | |
04:16 | wahanui | i guess items is library-speak for books. |
04:16 | dcook | I suppose items is the next hurdle |
04:17 | (And facets for search) | |
04:17 | Although the Zebra facets would get around that problem.. | |
04:17 | And you could even do facets the way we already do them and hard code them for different formats... which is ugly but doable | |
04:17 | Just need to detect the namespace in the incoming record which is easy enough as well | |
04:17 | Ah but not if we do the intermediary format | |
04:18 | But then if we do the intermediary format, we could use a standard way of getting our own facets out | |
04:18 | Bam! | |
04:18 | Logic all over the place.. | |
04:19 | I suppose any multi-format effort would have to look at search... and item creation... possibly all cataloguing... | |
04:19 | Well maybe not cataloguing. That could maybe be done later. As we could get the records in... | |
04:19 | * dcook | is just babbling now |
04:21 | Francesca joined #koha | |
04:24 | dcook | I wonder how it would work with RDF... |
04:24 | I suppose you can display RDF however you want | |
04:24 | In terms of linked data, you'd just need to publish the original record at the prescribed URI | |
04:24 | (and probably via a SPARQL endpoint) | |
04:25 | And that's something that magnuse is already working on | |
04:25 | was/is* | |
04:25 | I think.. | |
04:30 | Amit_Gupta joined #koha | |
04:36 | dcook | Although I guess with RDF it's much more difficult.. |
04:39 | As sometimes you need to follow the link.. | |
04:53 | Amit_Gupta | hi dcook |
04:53 | dcook | hey Amit |
04:57 | Francesca | hey dcook |
04:57 | dcook | hey Francesca |
04:57 | wahanui | is waging a war against the video driver that continues to fail on her vm |
04:57 | Francesca | lol |
04:58 | not anymore | |
04:58 | how goes aussie | |
04:58 | dcook | Mmm it goes |
04:58 | Reading more about RDF :p | |
04:59 | You? | |
04:59 | wahanui | You are welcome |
05:06 | Francesca | apart from being welcome, I am kinda bored |
05:06 | need a cat | |
05:06 | cats? | |
05:06 | wahanui | The only good cat is a stir-fried cat. |
05:07 | dcook | Bored? |
05:07 | Cats will certainly keep you from being bored | |
05:08 | Francesca | my cats are outside right now |
05:08 | they don't want to cuddle me or entertain me | |
05:09 | dcook | You could figure out how to integrate RDF into Koha :p |
05:20 | Francesca | what is RDF |
05:20 | wahanui | RDF is, like, just spinning round in circles on the water |
05:20 | Francesca | apart from that |
05:20 | dcook | resource description framework |
05:21 | Francesca | and what does it do |
05:21 | dcook | Example: http://dbpedia.org/page/Tim_Berners-Lee |
05:21 | Good question :p | |
05:21 | In theory, it lets you link machine readable resources together | |
05:21 | Francesca | ah |
05:21 | dcook | So look at that web page |
05:22 | Francesca | sorry don't know if I can help with that - interesting but I don't quite understand |
05:22 | dcook | I don't know if I understand quite yet either :p |
05:22 | Francesca | lol |
05:22 | dcook | The idea is that you have lots of datasets out there, and instead of copying the data... you put in a pointer |
05:22 | So for "birthPlace", you put a link to London rather than saying London | |
05:23 | Francesca | huh |
05:23 | dcook | Well, let's use you for an example |
05:23 | Instead of using the plain text of your parents names in your record | |
05:23 | You'd put in the URIs to their records | |
05:23 | Francesca | ok |
05:24 | so kinda just adding all the info in one place via links | |
05:24 | dcook | So if we're trying to view a record on you, the server should parse your record, and realize that your parents are referred to at those URLs |
05:24 | Yep | |
05:24 | So the server would fetch the records for your parents and display their info | |
05:24 | How it decides what info to show... I don't know | |
05:24 | I think that's up to the server... which is where it all falls apart imho | |
05:24 | Francesca | lol |
05:26 | server side I might not be as helpful | |
05:27 | dcook | Well, you could do it via AJAX :p |
05:27 | I just figure your machine has to be programmed with some knowledge of the schema of the remote resource | |
05:28 | I think there might actually be some facility for that with <rdfs:label/> in RDF/XML... | |
05:28 | Which makes sense to me | |
05:28 | Francesca | heh I've never worked in AJAX |
05:28 | dcook | If there is a RDF field to refer to the thing in plain text at its most base level |
05:28 | Well jeez, Francesca! | |
05:29 | Francesca | lol |
05:29 | my work mainly consists of looking at something and going hey can I re-style it | |
05:30 | dcook | That's also good :) |
05:31 | Francesca | got any webpages that need styling |
05:31 | I could do that quite quickly | |
05:31 | dcook | Oh probably but not off the top of my head |
05:32 | Francesca | lol if you think of any you know where to find me |
05:33 | dcook | Sounds good ;) |
05:35 | Francesca | good luck with the RDF |
05:36 | dcook | Thanks |
05:36 | At the moment, I don't know how it could ever be useful for searching | |
05:37 | Browsing for sure | |
05:37 | If you have the record retrieved, it could show you all sorts of interesting links | |
05:37 | But how do you retrieve that record? | |
05:37 | Francesca | yeah |
05:37 | an interesting problem for sure | |
05:37 | dcook | For sure |
05:38 | I suppose you could query your RDF triple store and say: who has a reference to this URL? | |
05:38 | But that's not very user friendly | |
05:39 | Francesca | yeah |
05:51 | dcook | @later tell magnuse Yo, we should chat RDF! |
05:51 | huginn | dcook: The operation succeeded. |
05:53 | Francesca | dcook: if its any comfort I think my job searching is going about as well as the RDF |
05:53 | dcook | :( |
05:53 | That doesn't sound good | |
05:53 | You're still in uni, yeah? | |
05:56 | Francesca | yeah |
05:56 | I was hoping to get an internship this year but things didn't work out | |
05:57 | dcook | :( |
05:59 | Francesca | so now I have to find a summer job because work would be good |
06:32 | saiful joined #koha | |
07:07 | laurence joined #koha | |
07:14 | * magnuse | waves |
07:17 | magnuse | dcook: yeah, let's talk rdf at some point |
07:20 | dcook: have you seen http://wiki.koha-community.org[…]i/Linked_Data_RFC - it's a braindump that is getting to be a few years old, but i think it still sums up my thinking on rdf and koha | |
07:21 | cait joined #koha | |
07:22 | rocio1 joined #koha | |
07:24 | sophie_m joined #koha | |
07:33 | marcelr joined #koha | |
07:33 | marcelr | hi #koha |
07:34 | cait | morning all |
07:34 | bbl | |
07:34 | cait left #koha | |
07:36 | magnuse | hiya marcelr and cait |
07:36 | fridolin joined #koha | |
07:36 | magnuse | ...and matts and fridolin |
07:36 | matts | hi ! |
07:36 | fridolin | bonjour magnuse and all of u |
07:37 | magnuse | @wuner boo |
07:37 | huginn | magnuse: downloading the Perl source |
07:37 | magnuse | @wunder boo |
07:37 | huginn | magnuse: The current temperature in Bodo, Norway is 8.0°C (8:20 AM CET on November 09, 2015). Conditions: Mostly Cloudy. Humidity: 66%. Dew Point: 2.0°C. Windchill: 2.0°C. Pressure: 29.15 in 987 hPa (Falling). |
07:47 | reiveune joined #koha | |
07:49 | reiveune | hello |
07:50 | magnuse | bonjour reiveune |
07:57 | alex_a joined #koha | |
07:57 | alex_a | bonjour |
08:00 | fridolin left #koha | |
08:05 | Francesca joined #koha | |
08:12 | jajm | hi |
08:15 | Francesca | hey jajm |
08:17 | cait joined #koha | |
08:17 | * cait | waves |
08:17 | * Francesca | waves back at cait |
08:17 | cait | hi Francesca :) |
08:18 | Francesca | sup :) |
08:18 | cait | did you have a nice weekend? |
08:28 | fridolin joined #koha | |
08:33 | gaetan_B joined #koha | |
08:33 | gaetan_B | hello |
08:33 | wahanui | hello, gaetan_B |
08:34 | indradg joined #koha | |
08:41 | Francesca joined #koha | |
08:42 | Francesca | cait: yes, good weekend thanks |
08:48 | Amit_Gupta | heya gaetan_B |
08:48 | wahanui | i think gaetan_B is working at Biblibre and did the nice new start page together with asaurat or a fan of icons |
08:50 | gaetan_B | hi Amit_Gupta |
09:00 | Traumatan joined #koha | |
09:03 | wilfrid joined #koha | |
09:17 | rocio1 left #koha | |
09:19 | paul_p joined #koha | |
09:34 | fridolin | I have a problem with debian jessie + indexdata repo, libnet-z3950-zoom-perl can not be installed |
09:34 | because of a dependency on perlapi-5.18.2, but the version 5.20.0 is installed | |
09:34 | maybe a problem with the packaging by indexdata | |
09:47 | Joubu | hi #koha |
09:47 | cait | hi Joubu! |
10:09 | mveron joined #koha | |
10:09 | mveron | Hi #koha |
10:14 | Joubu | cait: Are you able to recreate the failure on the report tests? |
10:15 | I have just recreated a fresh DB and the tests still does not fail for me | |
10:15 | cait | only the one time |
10:16 | not sure what needs to be done to get it back to the state before... | |
10:16 | it would point to something changing in the database then? | |
10:16 | i can try and load an older dump - but only tonight | |
10:17 | i was testing with my 'play' db - so there are probably some reports | |
10:20 | Joubu | Tomas told us that Jenkins loads a new DB now, so no reports should be there |
10:20 | cait | hm yeah |
10:20 | some variable not set the first time? | |
10:20 | I am really not sure why this happens | |
10:48 | Joubu | @later tell tcohen please send me a dump of a DB to recreate the failure on Reports_Guided.t |
10:48 | huginn | Joubu: The operation succeeded. |
11:25 | Traumatan joined #koha | |
11:28 | Amit_Gupta | hi cait |
11:52 | cait | first time patch writer.... someone around for testing? bug 15136 |
11:52 | huginn | 04Bug http://bugs.koha-community.org[…]_bug.cgi?id=15136 enhancement, P5 - low, ---, contact, Needs Signoff , Display item's homebranch in patron's fines list |
12:08 | clrh joined #koha | |
12:09 | tcohen joined #koha | |
12:11 | drojf joined #koha | |
12:11 | drojf | hi #koha |
12:18 | tcohen | morning #koha |
12:19 | Joubu: it is in /home/jenkins/koha_3_20_00.sql.gz | |
12:19 | Joubu | ok thanks |
12:26 | tcohen | Joubu: the problem I think is side effects from previously run tests |
12:26 | Joubu | tcohen: the tests pass with this DB... |
12:28 | tcohen | Joubu: :-( |
12:29 | maybe try to run the previously run tests (on jenkins list) | |
12:30 | Joubu | tcohen: how do you generate the 'fresh' DB on jenkins? |
12:30 | tcohen | Joubu: only the file I mentioned earlier |
12:30 | (3.20.0 + updatedatabase.pl) | |
12:31 | Joubu | tcohen: I got some errors on updatedatabase: DBD::mysql::db do failed: Table 'audio_alerts' already exists [for Statement " |
12:31 | and some others | |
12:31 | so it's not a 3.20 db | |
12:31 | tcohen | http://jenkins.koha-community.[…]etedBuild/console |
12:31 | Joubu | hum... |
12:32 | ok I know, I hadn't erased the previous DB, so the tables existed | |
12:32 | tcohen | that sounds promising :-D |
12:33 | * tcohen | goes prepare coffee while the kohadevbox fires |
12:33 | * cait | waves |
12:33 | Joubu | ok drop + create + update + prove => ok |
12:33 | tcohen | Joubu: f*ck |
12:34 | on jenkins, the only one failing is GetTopIssues.t, maybe is date-related? | |
12:34 | jajm: were you involved on GetTopIssues.t? | |
12:34 | Joubu | no |
12:35 | ha sorry :) | |
12:35 | jajm | tcohen, yes i think... |
12:36 | tcohen | it's been failing for the last couple weeks |
12:36 | if you have the time, i'd appreciate that you took a look | |
12:37 | jajm | tcohen, where can I find the test output ? |
12:37 | tcohen | http://jenkins.koha-community.[…]er_D7/572/console |
12:37 | jajm | thx |
12:39 | Joubu | jajm: at one point, the AI for biblio and biblioitems are not in sync anymore |
12:39 | jajm | 17:33:39 [10:54:22] t/db_dependent/Circulation/GetTopIssues.t ................. ok 9952 ms |
12:39 | tcohen, ^ | |
12:39 | Joubu | http://jenkins.koha-community.[…]etedBuild/console |
12:39 | 18:28:12 t/db_dependent/Circulation/GetTopIssues.t (Wstat: 65280 Tests: 0 Failed: 0) | |
12:41 | paul_p joined #koha | |
12:41 | tcohen | ah, wrong one |
12:41 | heh | |
12:41 | cait | tcohen: could you foward the latest release notes to my workmail maybe? |
12:41 | tcohen | I think it is TooMany.t's fault |
12:42 | I'm regenerating them anyway | |
12:42 | jajm: http://jenkins.koha-community.[…]etedBuild/console | |
12:42 | it's a problem with the tests | |
12:43 | and I had the test failing without biblionumber-bibioitemnumber divergence | |
12:44 | the problem is the biblioitemnumber retrieved is NULL | |
12:44 | don't ask me how that happens :-D | |
12:46 | jajm: to reproduce what i do is | |
12:47 | vagrant up jessie ; vagrant ssh jessie ; cat /vagrant/scp koha_3_20_00.sql.gz | sudo koha-mysql kohadev | |
12:47 | sudo koha-shell kohadev; cd kohaclone ; perl installer/data/mysql/updatedatabase.pl | |
12:48 | and then if you run prove t/db_dependent/Circulation/GetTopIssues.t it passes | |
12:48 | (run it many times, it works) | |
12:48 | then run prove t/db_dependent/Circulation/TooMany.t prove t/db_dependent/Circulation/GetTopIssues.t => FAIL | |
12:51 | meliss joined #koha | |
12:54 | * magnuse | waves again |
12:55 | JoshB joined #koha | |
12:56 | tcohen | jajm: ok? |
12:57 | drojf joined #koha | |
13:16 | tcohen | jajm: https://theke.io/static/koha_3_20_00.sql.gz |
13:18 | @wunder cordoba, argentina | |
13:18 | huginn | tcohen: The current temperature in Cordoba, Argentina is 29.0°C (10:00 AM ART on November 09, 2015). Conditions: Scattered Clouds. Humidity: 48%. Dew Point: 17.0°C. Pressure: 29.80 in 1009 hPa (Steady). |
13:22 | magnuse | @wunder boo |
13:22 | huginn | magnuse: The current temperature in Bodo, Norway is 8.0°C (1:50 PM CET on November 09, 2015). Conditions: Mostly Cloudy. Humidity: 62%. Dew Point: 1.0°C. Windchill: 4.0°C. Pressure: 29.09 in 985 hPa (Steady). |
13:22 | magnuse | tcohen wins |
13:23 | tcohen | heh |
13:30 | jajm | tcohen, i reproduce the bug, investigating now... :) |
13:30 | tcohen | jajm: AWESOME |
13:38 | Dyrcona joined #koha | |
13:39 | talljoy joined #koha | |
13:40 | jajm | tcohen, is there a bz where i can send a patch or should I create a new one ? |
13:41 | tcohen | there's a bug i think |
13:41 | Joubu: ? | |
13:42 | jajm: fill a new one | |
13:44 | jajm | ok |
13:45 | Joubu | I didn't open it |
13:48 | NateC joined #koha | |
13:48 | jajm | tcohen, http://bugs.koha-community.org[…]_bug.cgi?id=15158 |
13:48 | huginn | 04Bug 15158: minor, P5 - low, ---, julian.maurice, Needs Signoff , t/db_dependent/Circulation/GetTopIssues.t is failing in Jenkins |
13:49 | tcohen | jajm: i owe you a pastis bottle this time |
13:49 | mario joined #koha | |
13:49 | mario | morning |
13:49 | jajm | tcohen, thanks, but i hate pastis... :) |
13:49 | tcohen | ok, so i'll keep the bottle :-P |
13:53 | cait | heh |
14:02 | Dyrcona joined #koha | |
14:03 | nick joined #koha | |
14:03 | pastebot | "tcohen" at 172.16.248.212 pasted "Joubu: the test" (25 lines) at http://paste.koha-community.org/157 |
14:03 | nick joined #koha | |
14:04 | wnickc joined #koha | |
14:04 | Joubu | tcohen: I suspect that it is mysql specific |
14:05 | tcohen | yeah, but that's what the DBIC people told on |
14:05 | s/on/me/ | |
14:07 | Joubu: the other option is to just create two consecutive biblios and check they have consecutive biblionumbers | |
14:07 | what do u think #koha | |
14:07 | Joubu | this is certainly better |
14:13 | pastebot | "tcohen" at 172.16.248.212 pasted "Joubu: it even reads better :-P" (21 lines) at http://paste.koha-community.org/158 |
14:16 | fridolin left #koha | |
14:20 | fridolin joined #koha | |
14:21 | tcohen | bug 15159 |
14:21 | huginn | 04Bug http://bugs.koha-community.org[…]_bug.cgi?id=15159 normal, P5 - low, ---, tomascohen, Needs Signoff , TestBuilder behaviour on AI values should be tested |
14:24 | tcohen | jajm++ |
14:27 | cma joined #koha | |
14:34 | saiful joined #koha | |
14:51 | JoshB joined #koha | |
14:53 | amyk joined #koha | |
15:02 | paul_p joined #koha | |
15:05 | cait | jajm: around? i had a line with pull downs showing up when i had applied the statistic wizard cataloguing patch |
15:05 | no label in front | |
15:06 | it's still there with the patch applied | |
15:06 | the second row | |
15:07 | i have no idea what it does, the code refers to it as cotedigits.... ? | |
15:11 | gaetan_B | in the circulation rules the "Default checkout limit by patron category" says "For this library, you can edit rules for given itemtypes, regardless of the patron's category." |
15:11 | but http://schema.koha-community.o[…]r_circ_rules.html doesn't store the branch | |
15:12 | is it stored somewhere else or is the interface wrong? | |
15:13 | Traumatan joined #koha | |
15:13 | gaetan_B | ok this is indeed stored elsewhere |
15:17 | fridolin joined #koha | |
15:21 | xarragon | Hmm, I was changing C4::Members::Attributes.pm and tried running ./t/db_dependent/Members_Attributes.t which failed. It fails for current master as well as 3.20.3. Am I doing anything wrong? |
15:21 | 1/60 Can't call method "default_privacy" on an undefined value at /var/koha/Koha/C4/Members.pm line 750 | |
15:24 | Guess I need to check the test invocation, b/c the file is only 350 lines long | |
15:24 | gaetan_B | cait: "cote" is the french word for "callnumber" maybe a bad translation from jajm ? |
15:25 | xarragon | Ah, I see.. probably need to have PWD be t subdir |
15:25 | jajm | cait, I don't have this row... :/ |
15:27 | tcohen | xarragon: it is not failing in master for me |
15:27 | jajm | cait, ah yes I see... :) |
15:27 | tcohen | xarragon: are you sure you updated your db, etc? |
15:28 | xarragon | tcohen: I think I just invoked it wrong. Just checked the wiki. |
15:29 | tcohen: I just called prove ./t/db_dependant/Members_Attributes.t | |
15:30 | cait | jajm: did you find it? was making tea :) |
15:31 | jajm | cait, i don't know exactly what it's supposed to do, but it's in master too, maybe you should file a new bug for this ? |
15:32 | cait | yeah that's what i meant in my comment - old problem.. but gah. :) |
15:32 | not going to force you to fix it ;) | |
15:33 | jajm | cait, it's really nice of you :) |
15:34 | it's there since 2005 ... | |
15:34 | cait | one of the old and weird things we have where probably noone knows what it's supposed to do... ) |
15:52 | bdonnahue joined #koha | |
15:53 | huginn | New commit(s) kohagit: Bug 15158: Fix t/db_dependent/Circulation/GetTopIssues.t <http://git.koha-community.org/[…]84f0d3502ae3153bd> / Bug 14867: userid not generated when defined in BorrowerUnwantedField <http://git.koha-community.org/[…]ad09c17fb7b7b8913> / Bug 14388: Funds should be sorted by budget_code <http://git.koha-community.org/gi |
16:03 | TGoat joined #koha | |
16:06 | tcohen | @later tell marcelr can we fix that "WARNING: You do not have upload_path..." thing? setting a default path and taking care of the installer creating it? |
16:06 | huginn | tcohen: The operation succeeded. |
16:10 | rocio joined #koha | |
16:11 | francharb joined #koha | |
16:38 | laurence left #koha | |
16:46 | pianohacker joined #koha | |
16:46 | pianohacker | hello |
16:51 | saiful joined #koha | |
16:53 | cait | hi pianohacker |
16:53 | fridolin left #koha | |
17:03 | drojf joined #koha | |
17:11 | gaetan_B | bye |
17:17 | reiveune | bye |
17:17 | reiveune left #koha | |
17:18 | cait left #koha | |
17:47 | wnickc | @later tell wizzyrea I redid bug 14739 for 3.18, give it a once over and let me know if you spot any problems |
17:47 | huginn | wnickc: The operation succeeded. |
18:05 | New commit(s) kohagit: Bug 14402: (QA followup) Add notes to usage text about --fees <http://git.koha-community.org/[…]90409de1d30a0ddbd> / Bug 14402: Make purge_zero_balance_fees() delete fees with NULL balance. <http://git.koha-community.org/[…]c181bba8d55c43bc4> / Bug 14402: Add option --fees to /misc/cronjobs/cleanup_database.pl <http:/ | |
18:21 | francharb joined #koha | |
18:32 | BobB joined #koha | |
18:47 | huginn | New commit(s) kohagit: Bug 8064: DBRev 3.21.00.054 <http://git.koha-community.org/[…]08c8f9e4ac80be5f3> / Bug 8064: Fix unit tests for createMergeHash <http://git.koha-community.org/[…]264537cb1bec3ef54> / Bug 8064: Little fix for 003, 005, 008 in MARC21 <http://git.koha-community.org/[…]a53560b411dd44e14 |
19:07 | New commit(s) kohagit: Bug 15036: Do not overwrite complete status in basket ops <http://git.koha-community.org/[…]600015c9be3320bf0> | |
19:10 | magnuse joined #koha | |
19:31 | tcohen | vagatn destroy jessie |
19:31 | oops | |
19:31 | bye #koha | |
20:22 | sophie_m left #koha | |
20:42 | liz joined #koha | |
20:51 | cdickinson joined #koha | |
20:52 | bdonnahue | hey guys is there a way to have koha import mark 21 records in batch? |
20:58 | magnuse joined #koha | |
21:11 | pianohacker | bdonnahue: yes, there's the stage marc records for import under Tools, there's the bulkmarcimport.pl script |
21:18 | bdonnahue | pianohacker: thanks |
21:18 | is there any way to have koha "look up" a marc record from a reputable source like lib of congress and then import it if it is correct? | |
21:21 | rocio_away left #koha | |
21:58 | eythian | bdonnahue: yes, that's Z39.50 under the cataloguing screen |
22:16 | dcook: your email feels a bit stream-of-consciousness :) | |
22:16 | dcook | eythian: Well... you know me |
22:16 | Definitely stream-of-consciousness | |
22:16 | I really did intend it just to be about library size at first | |
22:16 | :p | |
22:16 | eythian | heh |
22:17 | it sure veers off | |
22:17 | dcook | I figure no one is really going to read the whole thing anyway |
22:17 | Yeah, the size thing had to do with database size and I started thinking about how we probably could handle a lot more records than we do | |
22:17 | I mean... I work on other databases with millions upon millions of rows and it's not a big deal | |
22:18 | But it's more complex than that of course | |
22:18 | eythian: I wrote a thing for library school students about how it would be wise to concentrate more on learning than marks | |
22:19 | Closed it with </unsolicited rant> | |
22:19 | Which a friend pointed out could be my middle name :p | |
22:19 | eythian | haha |
22:19 | dcook | Whoops. Wrote eythian in the 040$c |
22:19 | Don't think you're the transcribing agency.. | |
22:19 | pianohacker | ha! |
22:19 | eythian | I am now! |
22:20 | dcook | hehe |
22:40 | burdsjm__ joined #koha |
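The permissions tangent near the top of the log (flags in `borrowers`, `userflags` for top-level permissions, `permissions`/`user_permissions` for subpermissions) can be sketched with a toy model. This is a hedged illustration using an invented mini-schema in SQLite, not Koha's actual DDL; the one Koha fact it leans on is that `borrowers.flags` is an integer bitmask whose set bits correspond to rows in `userflags` by their `bit` column.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE userflags (bit INTEGER PRIMARY KEY, flag TEXT);
CREATE TABLE borrowers (borrowernumber INTEGER PRIMARY KEY,
                        surname TEXT, flags INTEGER);
-- toy data; real userflags has many more rows
INSERT INTO userflags VALUES (0,'superlibrarian'),(1,'circulate'),(9,'editcatalogue');
-- flags is a bitmask: bit n set means the userflags row with bit=n is granted
INSERT INTO borrowers VALUES (1,'cook', (1<<1)|(1<<9)), (2,'eythian', 0);
""")

# eythian's question: which users have any permissions at all
anyperm = [r[0] for r in con.execute(
    "SELECT surname FROM borrowers WHERE flags <> 0")]
print(anyperm)  # ['cook']

# decode one borrower's bitmask against userflags
flags = con.execute(
    "SELECT flags FROM borrowers WHERE borrowernumber=1").fetchone()[0]
granted = [f for bit, f in con.execute("SELECT bit, flag FROM userflags")
           if flags & (1 << bit)]
print(granted)  # ['circulate', 'editcatalogue']
```

Subpermissions would then hang off each granted top-level bit, which is roughly what `user_permissions` records per borrower.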
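The multi-format display idea dcook and eythian converge on — a conversion step per metadata format into one intermediary shape, then a single display transform per situation — can be sketched in plain Python. All names here are hypothetical; in Koha this would actually be done with XSLTs, one conversion stylesheet per record format and one display stylesheet each for results, detail, and staff client.

```python
# One converter per metadata format -> common intermediary dict,
# then a single display function per situation (results, detail, ...).
def from_marcxml(rec):          # hypothetical converter for MARC-ish input
    return {"title": rec["245a"], "author": rec.get("100a", "")}

def from_dublin_core(rec):      # hypothetical converter for DC-ish input
    return {"title": rec["dc:title"], "author": rec.get("dc:creator", "")}

CONVERTERS = {"marcxml": from_marcxml, "dc": from_dublin_core}

def render_result(inter):       # the one "display XSLT" for a results page
    return f"<h3>{inter['title']}</h3><p>{inter['author']}</p>"

def display(fmt, rec):
    return render_result(CONVERTERS[fmt](rec))

html1 = display("marcxml", {"245a": "Koha manual", "100a": "Community"})
html2 = display("dc", {"dc:title": "Koha manual", "dc:creator": "Community"})
print(html1 == html2)  # True: same display regardless of source format
```

The payoff is exactly the "little modular lego bricks" point: adding a new format means one new converter, while every display situation comes along for free.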
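dcook's birthPlace example — storing a URI pointer instead of the literal "London" and dereferencing it at display time, with `rdfs:label` as the plain-text fallback — can be sketched with an in-memory triple store. The URIs and data below are illustrative, not real dbpedia records.

```python
# Tiny in-memory "triple store": subject -> predicate -> object.
# Objects that are URIs act as pointers to other resources.
STORE = {
    "http://example.org/person/tbl": {
        "name": "Tim Berners-Lee",
        "birthPlace": "http://example.org/place/london",  # a link, not text
    },
    "http://example.org/place/london": {
        "rdfs:label": "London",  # human-readable label at the most basic level
    },
}

def label(uri):
    """Follow a URI and return its rdfs:label, falling back to the URI."""
    return STORE[uri].get("rdfs:label", uri)

def render(uri):
    rec = STORE[uri]
    place = label(rec["birthPlace"])  # dereference the pointer for display
    return f"{rec['name']} was born in {place}"

line = render("http://example.org/person/tbl")
print(line)  # Tim Berners-Lee was born in London
```

The search problem raised later in the log is visible even here: rendering only works once you already hold the starting URI; finding "everyone born in London" means asking the store who points at that URI, which is a query over links rather than over text.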