IRC log for #koha, 2017-04-18


All times shown according to UTC.

Time Nick Message
00:00 BobB joined #koha
00:01 aleisha_ joined #koha
00:04 aleisha_ hi all
00:04 kidclamp hi aleisha_
00:04 wizzyrea hi aleisha
00:04 aleisha_ left #koha
00:04 aleisha_ joined #koha
00:05 aleisha_ hows it going kidclamp
00:05 kidclamp pretty good, how's about you?
00:07 irma joined #koha
00:08 * dcook waves to folk
00:09 JoshB joined #koha
00:10 kholt joined #koha
00:12 aleisha_ good thanks!
00:14 papa joined #koha
00:18 dcook kidclamp: Looking at an old blog post I made about linked data...
00:18 And the two records I've linked to are already dead links :p
00:19 The post is from November 2015, so I guess it's not super recent..
00:19 kidclamp hah, that is a problem
00:19 I need to reply to your post(s) by the way
00:19 dcook I need to keep researching I think
00:19 kidclamp heh, as do we all likely
00:20 two quick things: 1 - my big concern is just that we make it easy to switch backends
00:20 2 - we don't have to resolve our urls, it just seems way cooler to do so :-)
00:21 dcook hehe
00:21 Yeah, I want it easy to switch backends as well
00:21 Although I figure until we know what we want to do with the backend...
00:21 It might make it difficult to create an API for the backend
00:21 I still find updates so... troubling
00:22 Although maybe it's not so different from relational databases..
00:22 I find all the interconnections a bit mind boggling
00:22 Like... sure a URI can replace an ID in a relational database
00:22 But then what about all the triples that are linked from that URI..
00:22 There's no "ON DELETE CASCADE"
00:23 kidclamp I was at code4lib and talked to the Boston Public Library guys who are doing a lot of work, they implemented everything via SQL - any type was table
00:23 dcook Nor should there be I suppose... but :S
00:23 kidclamp it is definitely difficult
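A minimal sketch of what the missing cascade means in practice: with SPARQL 1.1 Update (here via rdflib), both directions have to be deleted by hand, and the record URI below is invented.

```python
# A hand-rolled "cascade" in SPARQL 1.1 Update via rdflib.
# The record URI is invented; nothing here is Koha code.
from rdflib import Graph

g = Graph()  # in practice, a store-backed graph

# Delete every triple the record asserts...
g.update("DELETE WHERE { <http://example.org/record/42> ?p ?o }")
# ...and every triple that points at it.
g.update("DELETE WHERE { ?s ?p <http://example.org/record/42> }")

# Anything the record linked out to (a cached graph about Birmingham,
# say) is untouched - there is no ON DELETE CASCADE to chase it down.
```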
00:23 dcook any type?
00:23 What do you mean by "any type was table"?
00:23 * kidclamp gets very bad at words when thinking/talking RDF
00:24 dcook I suppose I'm OK with using a relational database for a triplestore... unless it introduces peculiarities that make it impossible to move to a native triplestore
00:24 But I suppose if we use enough layers of abstraction, we shouldn't have to worry about that
00:24 kidclamp: Have I shown you this link? http://linkeddatabook.com/editions/1.0/#htoc84
00:24 kidclamp any node that was a 'type' I think I mean class?
00:24 dcook It was suggested to me by..
00:25 caboose joined #koha
00:25 kidclamp basically if they had a bunch of things, each type of thing was a table
00:25 kmlussier joined #koha
00:25 dcook This person: http://www.meanboyfriend.com/overdue_ideas/about/
00:25 Hmm still not sure I follow
00:25 Are you meaning like auth, bib, hold or a different type of "type"?
00:25 kidclamp RDF type/class
00:25 dcook Or like "Work", "Instance", "Person"
00:26 kidclamp that^
00:26 dcook Ah yeah I get you now
00:26 Yeah, that's another thing I've been thinking about
00:26 Which makes it so much harder I think.. haha
00:29 That's the thing I hate about RDF... it's so loose
00:29 Although little bit of trivia...
00:29 https://en.wikipedia.org/wiki/[…]Metadata_Platform
00:29 XMP sidecar files use RDF
00:29 kidclamp it feels like too much was done without enough thought on how it would actually work
00:29 dcook So if you're using Lightroom or Darktable or whatever... you're using RDF O_O
00:29 kidclamp: I'm so in agreement there
00:30 Or maybe libraries are misunderstanding it all
00:30 Or I'm misunderstanding it all haha
00:30 It doesn't seem like it's really meant to be interoperable per se...
00:31 But that you can point to a thing and say "Yeah... that's the thing I'm talking about"
00:31 Like that OCLC example..
00:31 You can say "Oh, I'm talking about http://dbpedia.org/resource/London"
00:31 And we all know you're talking about London, England once we've followed the links
00:31 But other than that...
00:31 it still seems like you need to do some work locally
00:31 And potentially hardcode a lot of things... or make complex mappings
00:32 kidclamp yeah, you need to index the terms you are linking to if you want to search at all
00:32 dcook search/display... yeah
00:32 Then keeping that up-to-date...
00:32 I think the keeping it up-to-date thing is what I struggle with most
00:33 kidclamp caching and regular crawling seems to be the only theory
00:33 dcook Yeah, I'd like to see some real life examples of it though
00:33 Especially since for updates, you need to delete your existing cache
00:33 Although I guess caches are supposed to be throwaway..
00:34 I feel like if I could just find one good production example... I'd have more confidence in the whole thing
00:34 I wouldn't really want to repeat the Zebra debacle
00:35 I mean... Zebra works but I don't think they quite knew how it worked at the start
00:36 kidclamp I think the more it is implemented the more different things will be done though
00:37 dcook Linked Data?
00:37 Seems like
00:37 kidclamp like ElasticSearch, we are using it for searching - that is the least of the things people use it for now
00:37 dcook Yeah?
00:37 wahanui hmmm... Yeah is there a good way to fix that to get the new one running?
00:37 dcook I admit I haven't kept up with ES too much
00:37 I mostly see ads for it in conjunction with other modules though yeah
00:39 kidclamp lots of statistics and logging stuff
00:40 dcook Mmm I'd seen the logging thing. Is that with Kibana?
00:40 I met someone from ElasticSearch a couple years ago and meant to stay in touch but just been too busy :/
00:40 kidclamp I like the 'percolation' feature - using a search of existing documents to classify a new document
00:40 yeah, es
00:40 wahanui i heard yeah, es was back working. Refactoring it to work with authorities at the moment
00:40 kidclamp es + kibana is most common, because it works out of the box with no config basically
00:40 you don't have to build a search engine :-)
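A rough sketch of the percolation idea mentioned above, using the official elasticsearch Python client. The index, field, and document are invented, and the mapping syntax is roughly the ES 5.x form - it shifts between versions.

```python
# Sketch of Elasticsearch percolation: store queries, then ask which
# stored queries match a new document. Names are invented; mapping
# syntax is roughly ES 5.x and shifts between versions (some versions
# also want a document_type on the percolate query).
from elasticsearch import Elasticsearch

es = Elasticsearch()

es.indices.create(index="classifier", body={
    "mappings": {"doc": {"properties": {
        "query": {"type": "percolator"},  # stored queries live here
        "title": {"type": "text"},
    }}}
})

# Register a stored query: "anything about linked data"
es.index(index="classifier", doc_type="doc", id="linked-data",
         body={"query": {"match": {"title": "linked data"}}})

# Classify a new document by asking which stored queries match it
result = es.search(index="classifier", body={
    "query": {"percolate": {
        "field": "query",
        "document": {"title": "Linked data in libraries"},
    }}
})
print([hit["_id"] for hit in result["hits"]["hits"]])  # ['linked-data']
```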
00:41 dcook :D
00:41 I do wonder sometimes how sophisticated these searches are though
00:41 Mind you, there's the whole "full text indexing is all you need" idea
00:41 I haven't looked at Kibana though so I'm just rambling
00:42 What was I thinking of..
00:42 Too many browser tabs
00:42 I'm looking at this at the moment: http://stanbol.apache.org/overview.html
00:42 But I don't think it's what I'm really looking for..
00:43 Too bad the word semantic can be used in too many different ways :p
00:44 kidclamp interesting looking though
00:45 dcook I feel a bit like LIBRIS might not be using RDF "correctly"...
00:45 And that is adding to my confusion..
00:46 http://linkeddatabook.com/editions/1.0/#htoc81
00:47 wizzyrea forget yeah
00:47 wahanui wizzyrea: I forgot yeah
00:48 dcook "Multiple Named Graphs can be represented together in the form of a graph set." that looks useful..
00:51 Wonder if I hadn't read this before or just forgot about it since November 2015..
00:53 So in this example... let's say we have an authority record about "Dave Smith"
00:54 It's time to refresh the data, so we start crawling the links... and we save the results to their own named graphs
00:54 Makes sense to me... I don't know how else you'd reasonably manage them..
00:55 Although it still seems like you could wind up with a lot of useless data over time
00:55 I suppose you could have a cleanup job...
00:56 Checking every graph if it's referred to by another graph within the triplestore...
00:56 Because let's say that Dave moves away to the US from the UK. No longer lives near Birmingham.
00:56 We have this cached Birmingham graph
00:57 Even if we were manually updating the record on Dave... I don't think it would make sense to check at deletion time if anyone else is referring to that graph... as that would be a heavy processing job..
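A sketch of such a cleanup query, assuming the convention (used later in this discussion) that a cached graph is named by the URI of the resource it describes; rdflib's Dataset stands in for whatever triplestore sits behind it.

```python
# Sketch: find cached named graphs nothing else refers to, so a
# periodic cleanup job could DROP them. Assumes each cached graph is
# named by the URI of the resource it describes.
from rdflib import Dataset

ds = Dataset()  # in practice, backed by the real triplestore

orphans = ds.query("""
    SELECT DISTINCT ?g WHERE {
        GRAPH ?g { ?s ?p ?o }
        FILTER NOT EXISTS {
            GRAPH ?h { ?x ?y ?g }   # some other graph still points here
            FILTER (?h != ?g)
        }
    }
""")
for row in orphans:
    print(row.g)  # candidates for DROP GRAPH <...>
```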
00:58 kidclamp need the RDF equivalent of 404 page - We deleted that info as it was old and moldy
00:58 please stop linking here
00:59 dcook Yeah I think about that as well
00:59 I mean..
00:59 If you're crawling and you get a 404... what do you do?
00:59 Maybe it's a case of a web app error and it will come back
00:59 Or maybe it's gone for good
00:59 I like the idea of linked data, but...
01:00 Not sure how practical it is :/
01:02 kidclamp don't tell talljoy, sometimes i like the idea of marc records supplemented by linked data - use the work links to aggregate, but keep our marc there as the base for searching etc
01:02 shhhhhh
01:02 dcook hehe
01:03 Arguably none of this is relevant for my work of course...
01:04 If I recall correctly, I'm supposed to just get the RDF in to the triplestore
01:04 But... I need to know how we're doing that a bit if I'm going to do that
01:04 So with Stockholm University Library, they're using OAI-PMH to get RDF records from LIBRIS
01:04 LIBRIS being the national library's union catalogue
01:05 Makes sense to me. Cataloguing centrally is more efficient than duplicating effort across a country.
01:05 But it poses some problems..
01:05 I've been thinking about saving those triples under a named graph with the name coming from Koha
01:05 But since Thursday I've been thinking...
01:06 I should use the URI from LIBRIS
01:06 and then in Koha... just have something like owl:sameAs
01:06 or koha:derivedFrom
01:06 Something like that
01:06 Maybe... koha:oai-pmh
01:06 So that we don't recrawl it...
01:06 Since my OAI-PMH harvester is pushing up-to-date records into the triplestore
01:07 We don't need a crawl to fetch them
01:07 Of course, I think this is a bit where old world and new world clash... shouldn't need OAI-PMH in theory
01:07 Not for updates
01:07 * dcook shrugs
01:07 dcook Complicated
01:07 wahanui somebody said Complicated was far too mild a term to describe Search.pm.
01:07 dcook Ouais, wahanui, ouais
01:08 Mmm, or... instead of koha:derivedFrom or whatever
01:08 I use OAI-PMH to save the LIBRIS record to a named graph
01:08 Then... create linkages automatically with a separate Koha named graph
01:08 Named graph for a Koha bib that is
01:09 Those linkages using predicates that are pre-agreed upon
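A minimal rdflib sketch of that layout, with invented URIs and an invented koha:derivedFrom predicate: the harvested LIBRIS triples live in a graph named by their own URI, and the Koha bib's graph carries the linkage.

```python
# Sketch of the layout described above: harvested LIBRIS triples in a
# graph named by the LIBRIS URI, plus a Koha graph holding the linkage.
# All URIs and the koha:derivedFrom predicate are invented.
from rdflib import Dataset, Namespace, URIRef

KOHA = Namespace("http://koha.example.org/vocab#")
libris_uri = URIRef("https://libris.example.org/bib/12345")
koha_bib = URIRef("http://opac.example.org/bib/678")

harvested_rdfxml = """<rdf:RDF
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="https://libris.example.org/bib/12345">
    <dc:title>Example title</dc:title>
  </rdf:Description>
</rdf:RDF>"""

ds = Dataset()

# 1. The OAI-PMH harvester pushes the record into its own named graph
ds.graph(libris_uri).parse(data=harvested_rdfxml, format="xml")

# 2. The Koha bib's graph records where it came from
ds.graph(koha_bib).add((koha_bib, KOHA.derivedFrom, libris_uri))
```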
01:09 Because the question becomes... what vocabulary/vocabularies do we use in Koha?
01:10 That kind of comes up at http://linkeddatabook.com/editions/1.0/#htoc84
01:10 "In order to understand as much Web data as possible, Linked Data applications translate terms from different vocabularies into a single target schema."
01:10 I think we can see that with OCLC
01:11 While it points to http://dbpedia.org/resource/London, it uses the "schema" schema
01:11 Instead of the schemas preferred by dbpedia
01:11 I assume they're mapped somewhere in OCLC's back-end
01:11 And that makes sense
01:11 Without those mappings... I think it would be impossible..
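As a guess at the mechanics (not OCLC's actual backend), a SPARQL CONSTRUCT can do that kind of translation - here rewriting dbpedia's rdfs:label into schema:name.

```python
# Sketch of vocabulary translation into a single target schema: a
# SPARQL CONSTRUCT rewriting dbpedia's rdfs:label as schema:name.
# This is a guess at the mechanics, not OCLC's actual pipeline.
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
<http://dbpedia.org/resource/London> rdfs:label "London"@en .
""", format="turtle")

mapped = g.query("""
    PREFIX rdfs:   <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX schema: <http://schema.org/>
    CONSTRUCT { ?thing schema:name ?name }
    WHERE     { ?thing rdfs:label ?name }
""")
for triple in mapped:
    print(triple)  # everything now speaks schema:name
```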
01:12 Folk like Oslo Public Library and LIBRIS use their own schemas...
01:12 Whether Koha comes up with its own schema or uses an existing standard..
01:12 kidclamp: And I think that's vitally important in terms of indexing the data
01:12 wizzyrea https://xkcd.com/927/
01:12 dcook wizzyrea: I think that's my favourite xkcd of all time
01:12 That and the one about encryption
01:13 https://xkcd.com/538/
01:13 That's the one
01:13 Bloody hell..
01:13 Spanner came to mind before wrench
01:13 I'm being assimilated...
01:13 kidclamp yeah, I am up in the air about choosing a schema
01:14 dcook To be honest, I think it's something that should've happened a long time ago
01:14 MARC is a great interchange format, but I think its limitations...
01:14 Well they've hindered us I think
01:15 kidclamp I think as long as we have a way to index any specific schema we can get away with being flexible, it just means a ton of mapping work - choosing one schema and running with that doesn't preclude supporting others, it just means we focus the work in one place
01:15 dcook That's a good point
01:15 Not hard-coding things++
01:16 kidclamp agnosticism++
01:18 dcook I suppose that will have to be in the SPARQL..
01:18 I really dislike how you can't make parameterized SPARQL queries
01:19 But I suppose subjects and predicates should all be in URI format
01:19 So that should make data validation a bit easier
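One partial workaround, at least in rdflib: initBindings binds variables as RDF terms rather than interpolating strings. A sketch, with an invented subject URI.

```python
# Sketch: rdflib's initBindings as a stand-in for real parameterized
# SPARQL - variables are bound as terms, not interpolated as strings.
# The subject URI is invented.
from rdflib import Graph, URIRef

g = Graph()
rows = g.query(
    "SELECT ?p ?o WHERE { ?s ?p ?o }",
    initBindings={"s": URIRef("http://example.org/record/42")},
)
```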
01:19 I keep thinking this is going to be so inefficient..
01:20 Actually, I think we'd still need a single target schema
01:20 Well one way or another..
01:21 In theory, Zebra could actually be used too.
01:21 Since it can handle any XML-based format
01:23 Vocabulary mapping... and that's how we get LIBRIS into a format Koha knows..
01:24 But check this out: https://libris.kb.se/data/oaip[…]dataPrefix=rdfxml
01:24 At the top level we have kbv:Record
01:25 Then sdo:mainEntity then bf2:Instance then bf2:title then bf2:InstanceTitle then bf2:mainTitle
01:25 Just to get the title
01:25 I guess that would look something like... kbv:Record/sdo:mainEntity/bf2:instance/bf2:title/bf2:InstanceTitle/bf2:mainTitle in xpath..
01:30 kidclamp so simple
01:30 dcook hehe
01:30 And surely there must be non-XML based mappings..
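In SPARQL, for instance, that whole chain collapses into a 1.1 property path, since the intermediate Instance and InstanceTitle nodes are just traversed. A sketch - the prefix URIs for kbv, sdo, and bf2 are guesses.

```python
# Sketch: the nested chain above as a SPARQL 1.1 property path rather
# than XPath. The prefix URIs for kbv/sdo/bf2 are guesses.
TITLE_QUERY = """
    PREFIX kbv: <https://id.kb.se/vocab/>
    PREFIX sdo: <http://schema.org/>
    PREFIX bf2: <http://id.loc.gov/ontologies/bibframe/>

    SELECT ?title WHERE {
        ?record a kbv:Record ;
                sdo:mainEntity/bf2:title/bf2:mainTitle ?title .
    }
"""
```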
01:32 http://data.linkedmdb.org/page/film/2014
01:32 I've seen this with LIBRIS... a mix of their own schema and established schemas
01:34 Then on dbpedia: http://dbpedia.org/page/The_Shining_%28film%29
01:34 The title is dbp:name
01:34 On linkedmdb they use dc:title
01:34 Let's say we were cataloguing The Shining in Koha..
01:35 You'd maybe use owl:sameAs for linkedmdb and dbpedia..
01:35 Maybe too a library database like LIBRIS, Library of Congress, National Library of Australia, etc
01:35 But then you might want to use your local schema that you could index easily..
01:36 Now that I think about it... aren't we already using microdata...
01:36 kidclamp supposedly at least :-)
01:36 but I am off for the night
01:37 dcook I don't even know where you live so I can't hassle you :p
01:37 night in any case :)
01:37 kidclamp I wish our time zones coincided when I was less tired :-)
01:37 Vermont, North East USA
01:37 UTC -5?
01:37 dcook 9:37pm, eh? Yeah I guess that's fair
01:37 That's my ideal bedtime :p
01:37 kidclamp it's my beertime
01:37 then bedtime
01:38 dcook Yeah I wish our time zones coincided more too
01:38 You have kids?
01:38 kidclamp one
01:38 but he is the best one
01:38 :D
01:38 dcook Oh man, I misread beertime as bedtime
01:38 hehe
01:38 That's what I always say too
01:38 Anyway, I won't keep you. Enjoy :)
01:38 I might send out another email... a much shorter one
01:38 kidclamp go for it, I will talk with Joy and argue and respond :-)
01:38 night
01:39 dcook hehe
01:42 Hmm... maybe using the schema.org schema would be a good place to start
01:46 rangi we have schema.org support in koha already
01:46 dcook rangi: Yeah, that's what I mean
01:46 rangi and it is extensible
01:46 dcook Well, in terms of microdata at least
01:46 rangi yep
01:46 dcook So that's really good
01:47 We could use that in a triplestore as well
01:47 rangi https://bib.schema.org/
01:47 we should use more of what is available here
01:47 * dcook gives a thumbs up
01:49 dcook So yeah... we already have embedded RDF statements using HTML+Microdata...
01:49 We could have those RDF statements in a triplestore...
01:49 rangi yep a good starting point
01:50 dcook Yeah, I think they all use literals for the moment but that's OK
01:50 As you say, good starting point
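A sketch of lifting those embedded statements out, assuming the third-party extruct library for the microdata parsing; the HTML fragment and bib URI are invented stand-ins for what the OPAC actually emits.

```python
# Sketch: lift embedded schema.org microdata into triples. Assumes the
# third-party extruct library; the HTML fragment and bib URI are
# invented stand-ins for what the OPAC actually emits.
import extruct
from rdflib import Graph, Literal, Namespace, URIRef

SCHEMA = Namespace("http://schema.org/")

html = """
<div itemscope itemtype="http://schema.org/Book">
  <span itemprop="name">Good Omens</span>
</div>
"""

items = extruct.extract(html, syntaxes=["microdata"])["microdata"]

g = Graph()
bib = URIRef("http://opac.example.org/bib/678")  # would come from the biblio
for item in items:
    for prop, value in item["properties"].items():
        g.add((bib, SCHEMA[prop], Literal(value)))
```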
01:50 rangi have you seen this?
01:50 http://linter.structured-data.[…]dl%253A%2520fiish
01:50 dcook Nope
01:51 That's pretty coool
01:52 I wonder if there's a nicer way to do subjects than schema:keywords..
01:53 rangi probably
01:53 dcook I wish I could see what OCLC is doing behind the scenes.. as I like some of their examples
01:53 http://www.worldcat.org/oclc/973430700
01:55 rangi right
01:56 dcook Looking at theirs... I feel like <http://www.worldcat.org/oclc/973430700> is probably in a named graph
01:56 As are the "related entities"
01:56 And they've just aggregated them here on the page..
01:56 Although they do it in the downloads too
01:56 Seemingly with SPARQL's DESCRIBE, although I haven't played with that in terms of named graphs yet
01:58 Probably wouldn't be that hard..
01:58 rangi: The thing I find interesting with that example is the <http://dbpedia.org/resource/London> entry
01:58 schema:name is something that OCLC must have generated
01:59 As that triple doesn't exist in dbpedia
01:59 I reckon when they crawled the dbpedia URI... they must have run it through a mapper and only saved it with the mapped/filtered triples..
01:59 Although I suppose they could've constructed the entity for display purposes here too..
02:00 Now if you were to import this record from OCLC...
02:00 Well that's where my mind bends a bit..
02:01 You've ordered "Good Omens" as you got a purchase request for it
02:01 You subscribe to Worldcat
02:01 You need a local record in your LMS so that your library patrons can find it
02:02 Do you just do a owl:sameAs?
02:02 Maybe you fill out some basic details for your indexer?
02:03 Using owl:sameAs or some alternative... perhaps you could define some rules to crawl that entity...
02:03 And show that somewhere on your web page..
02:03 But that doesn't necessarily make sense..
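A sketch of the owl:sameAs-plus-local-details option: the local bib URI and the choice of schema:name for the indexer are invented, and the Worldcat URI is the one from the example above.

```python
# Sketch of "owl:sameAs plus just enough local detail for the indexer".
# The local bib URI is invented; the Worldcat URI is from the example
# discussed above.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL

SCHEMA = Namespace("http://schema.org/")
local = URIRef("http://opac.example.org/bib/679")

g = Graph()
g.add((local, OWL.sameAs, URIRef("http://www.worldcat.org/oclc/973430700")))
# A few local literals so patrons can find it without a remote crawl:
g.add((local, SCHEMA.name, Literal("Good Omens")))
```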
02:06 rangi yep
02:06 you should really talk with the Oslo people
02:06 * dcook agrees
02:06 rangi as they have a fully rdf based catalogue working
02:07 only ones in the world afaik
02:07 dcook I think LIBRIS might be too, but I'm not 100% sure yet
02:07 Not sure if they're still demoing or not
02:07 Need to talk to them too
02:07 rangi i think its still marc, that they render as rdf on the fly
02:07 dcook Mmm, I don't think so
02:07 I took a look a bit at their github
02:07 Or maybe it used to be..
02:08 I think now they have an editor (Catalinker)
02:08 And that saves to both the triplestore and Koha's marc database
02:08 rangi ah cool
02:08 dcook Didn't have enough time to totally go through it all
02:08 rangi (i meant libris not oslo)
02:08 dcook Do you know if that's all done by Petter or if they have others?
02:08 rangi lots and lots of others
02:08 dcook Yeah, the LIBRIS records are intense... and I'm not sure if they're 100% correct..
02:08 Not enough experience to know for sure though
02:08 rangi rurik is the main project manager / tech lead
02:09 dcook Rurik Greenall?
02:09 rangi yup
02:09 dcook I figure if I can chat to them... they might be able to clarify everything
02:09 rangi https://twitter.com/brinxmat
02:10 dcook I mean... they're really the perfect people to talk to since they're interfacing with Koha too
02:10 rangi drop him a tweet
02:10 * dcook thumbs up
02:10 rangi he was at Kohacon16 and gave a good talk at the hackfest
02:10 dcook What was the talk about?
02:11 Oh wait hackfest... those wouldn't be recorded
02:12 caboose joined #koha
02:13 dcook tweet sent
02:13 I think I can wrap my head around almost everything except copy cataloguing with RDF
02:15 rangi heh, im not @rangi im @ranginui (just fyi)
02:15 dcook Ah balls
02:15 I was going to double-check!
02:16 ibeardslee check twice, tweet once
02:16 dcook hehe
02:16 I haven't tweeted in too long... can't even find my own tweet now..
02:17 Surely it should show up under my tweets..
02:17 rangi https://twitter.com/minusdavid[…]54155774943154176
02:17 dcook Cheers
02:18 Don't know why it's not showing up in my UI
02:18 Ahh, because I don't understand conventions I guess..
02:19 Tweets & replies rather than just Tweets... I guess because I @ed someone?
02:19 rangi ah yeah
02:20 dcook Hmmm https://bibflow.library.ucdavi[…]/copy-cataloging/
02:21 Gotta love super low res images..
02:21 I'm too young to be squinting..
02:22 Interesting... it seems that they do download the OCLC graph..
02:23 But maybe just into the cataloguer
02:23 That seems...
02:24 I'm intrigued by an RDF->MARC conversion as well.
02:25 As I don't see a lot of data out there about that, yet that seems to be what UCDavis and Oslo both do..
02:26 Interesting... and they're using BIBFRAME... of some kind
02:27 I do wonder a bit about using some other tools for handling RDF records... and hooking them more loosely into Koha..
02:27 Not that that is even anywhere near my problem atm..
02:30 Actually, that all looks proposed...
02:34 https://news.minitex.umn.edu/n[…]d-data-cataloging
02:35 And this year the National Library of Finland is opening up its national bibliography as linked data it seems..
02:42 So if you did download from http://www.worldcat.org/title/[…]ns/oclc/973430700
02:43 Would you stick them all in one named graph or break them up..
02:43 If you broke them up, your schema:name would go into that named graph...
02:56 JoshB joined #koha
04:08 kholt joined #koha
04:14 kholt left #koha
05:14 magnuse joined #koha
05:14 * magnuse waves
05:15 cait joined #koha
05:18 liw joined #koha
05:19 * dcook waves
05:19 dcook Hey magnuse, chatting to @brinxmat on Twitter about LD atm
05:20 Sorry if I'm stepping on your toes at all
05:20 I think it relates to how I import/store the data though
05:25 At the moment, I'm thinking we download the records via OAI-PMH then... run them through a filter which creates triples that Koha can understand. We could also use SPARQL CONSTRUCT, although that would in theory involve more hardcoding...
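A sketch of that pipeline using the third-party Sickle OAI-PMH client; the endpoint URL is invented (the real one is the LIBRIS URL quoted earlier), and the filter shown - keeping only schema.org predicates - is just one possible rule.

```python
# Sketch of the harvest-then-filter pipeline: pull RDF/XML records over
# OAI-PMH (third-party Sickle client), parse them with rdflib, and keep
# only the triples the Koha target schema understands. The endpoint URL
# is invented; the schema.org-only filter is just one possible rule.
from lxml import etree
from rdflib import Graph, Namespace
from sickle import Sickle

SCHEMA = Namespace("http://schema.org/")
RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

harvester = Sickle("https://libris.example.org/oaipmh")
for record in harvester.ListRecords(metadataPrefix="rdfxml"):
    # unwrap the rdf:RDF payload from the OAI <metadata> envelope
    rdf_el = record.xml.find(".//{%s}RDF" % RDF_NS)
    if rdf_el is None:
        continue
    g = Graph()
    g.parse(data=etree.tostring(rdf_el, encoding="unicode"), format="xml")
    koha_triples = [(s, p, o) for s, p, o in g
                    if str(p).startswith(str(SCHEMA))]
    # ...push koha_triples into the triplestore under a named graph...
```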
05:26 aleisha_ joined #koha
05:27 aleisha_ joined #koha
05:28 aleisha_ joined #koha
05:28 aleisha_ joined #koha
05:29 aleisha_ joined #koha
05:30 aleisha_ joined #koha
05:31 aleisha_ joined #koha
05:32 aleisha_ joined #koha
05:32 papa joined #koha
05:32 aleisha_ joined #koha
06:13 magnuse dcook: sorry, was busy in another window
06:13 brinxmat probably has good advice
06:15 mveron joined #koha
06:16 mveron Good morning / daytime #koha
06:23 josef_moravec joined #koha
06:26 josef_moravec morning #koha
06:31 dcook magnuse: No worry. I have about a million different things happening at once anyway :)
06:31 4-5 always seems the busiest time of day!
06:31 magnuse: Unfortunately, I don't think he'll be able to help too much, but it'll be good to get some more details
06:32 They don't use named graphs though, which was interesting
06:32 Although maybe that's partially because they don't do any real linking
06:32 At least to outside entities
06:32 And they don't import RDF at all :/
06:36 ashimema joined #koha
06:41 magnuse dcook: yeah, bit of a different use case, i guess
06:45 alex_a joined #koha
06:46 dcook Yeah maybe
06:46 alex_a bonjour
06:46 wahanui hello, alex_a
06:50 reiveune joined #koha
06:51 reiveune hello
06:51 wahanui salut, reiveune
06:51 AndrewIsh joined #koha
07:01 oha o/
07:08 baptiste joined #koha
07:11 sameeenz joined #koha
07:14 dcook oha... that name is familiar
07:16 oha dcook: eheh, is it because (k)oha? :)
07:17 dcook I don't think so... I'm sure I've seen your full name somewhere :p
07:18 oha dcook: oh! not sure. i've used oha for more than 20 years now.
07:20 gaetan_B joined #koha
07:20 gaetan_B hello
07:20 wahanui privet, gaetan_B
07:28 oha speaking of oha vs koha, i usually add an "oha" comment when i change something so i can easily find it again. but with so many (k)oha strings this hasn't been working so well lately :)
07:30 dcook hehe
07:30 Anyway, I better head out
07:30 Good luck Eurofolk
07:36 sophie_m joined #koha
07:42 aleisha_ joined #koha
07:43 sameee joined #koha
07:45 alexbuckley_ joined #koha
07:51 alex_a joined #koha
07:56 fridolin joined #koha
07:56 fridolin hie there
07:57 happy Easter Egg :)
07:57 paul_p joined #koha
08:11 magnus_breakfast \o/
08:21 cait joined #koha
08:27 magnuse kia ora cait
08:28 cait hi magnuse
08:28 wahanui kamelåså
08:55 mveron_ joined #koha
09:00 liw joined #koha
09:02 sophie_m joined #koha
09:08 alex_a_ joined #koha
09:09 alex_a__ joined #koha
09:18 eythian hi
09:18 wahanui niihau, eythian
09:35 wizzycray joined #koha
10:07 khall joined #koha
10:27 alex_a joined #koha
10:28 magnuse ashimema: do you think it would make sense to try and get ILL as it stands now into 17.05? i should probably ask atheia...
10:34 Feature Slush for 17.05 is May 5 2017
10:34 @later tell atheia do you think it would make sense to try and get ILL as it stands now into 17.05?
10:34 huginn` magnuse: The operation succeeded.
10:52 josef_moravec left #koha
10:55 ashimema I'd love to see it in :)
10:55 we use it in production a lot
11:23 * cait waves
11:40 mveron joined #koha
11:45 oleonard joined #koha
11:45 magnuse ashimema: ah, if you use it in production that should be an "argument" for getting it in
11:45 i have started to look at adapting my existing NNCIPP code, so I might be able to do a signoff pretty soon
11:47 ashimema :)
11:47 atheia will be super happy :)
11:49 oha we will be too!
11:50 meliss joined #koha
11:57 toins joined #koha
11:57 toins hi all
11:57 nengard joined #koha
12:01 marcelr joined #koha
12:01 marcelr hi #koha
12:02 sophie_m1 joined #koha
12:03 * oleonard waves
12:09 caboose joined #koha
12:21 * mveron waves
12:21 * cait waves :)
12:21 mveron oleonard: Bug 7550 - what do you think about it?
12:21 huginn` 04Bug http://bugs.koha-community.org[…]w_bug.cgi?id=7550 normal, P5 - low, ---, veron, Needs Signoff , Self checkout: limit display of patron image to logged-in patron
12:22 oleonard I will take a look
12:33 misilot joined #koha
12:59 jzairo joined #koha
13:05 tcohen joined #koha
13:11 talljoy joined #koha
13:18 oleonard mveron++
13:19 mveron :-)
13:20 JoshB joined #koha
13:22 cait mveron++ :)
13:25 oleonard cait++
13:26 * cait didn't do much this release
13:27 oleonard More than me :P
13:28 cait revamping another website? :)
13:29 oleonard Not a big project, just a lot of small ones
13:30 chris_n joined #koha
13:31 kholt joined #koha
13:41 alex_a_ joined #koha
13:58 JesseM joined #koha
14:04 JesseM joined #koha
14:07 marcelr Joubu: hi; any chance to have another look at the qa changes on the upload reports 17669/18300 ?
14:12 rkrimme1 joined #koha
14:17 Mauricio_BR joined #koha
14:17 Mauricio_BR Hello friends. Please help me with something. I am trying to locate the table in the database which stores the words searched in the OPAC as an anonymous user (no login in OPAC). Do you know where I can find it?
14:19 oleonard Mauricio_BR: Where do you see it?
14:20 Mauricio_BR i am looking for it in the tables of the database
14:21 i have the 16.05 ver. of Koha
14:21 oleonard Why are you looking for it?
14:21 Mauricio_BR because i am working with datamining...
14:21 kidclamp I don't think search history is stored unless the user is logged in
14:21 Putti joined #koha
14:22 oleonard Oh I misunderstood what you were asking
14:22 Thought you were asking about a text string
14:22 Mauricio_BR oh, sorry
14:22 my english is not good as you can see...  haha
14:23 oleonard Better than my... Every other language which exists.
14:23 Mauricio_BR XD
14:24 oleonard Mauricio_BR: Have you looked at the 'search_history' table? I think kidclamp is right: I only see entries for logged-in users.
14:24 Mauricio_BR yes
14:25 but in this table every record has a user field
14:25 eythian isn't there a special user it becomes if it's anonymous that's configured by a syspref?
14:25 Mauricio_BR so it is bound to a user
14:26 there is the user with id 0 but it seems to be the admin
14:26 oleonard The AnonymousPatron pref says "(for anonymous suggestions and reading history)"
14:28 kidclamp in the code it seems non-logged-in history is stored for the session, but not in the db
14:29 we only save in the table if we have a logged in user
14:29 Mauricio_BR http://translate.koha-communit[…]l#AnonSuggestions
14:30 eythian ah, suggestions not history. I misremembered.
14:32 Mauricio_BR yes
14:32 is only for suggestions and/or reading history items.
14:33 ok, thanks guys for the information.
14:33 oleonard Are searches logged by zebra?
14:34 kidclamp could be a development though, seems possible in existing code to support saving as the anonymous patron - but you'd want it tied to a syspref
14:35 Dyrcona joined #koha
14:40 cait i think it#s in a cookie before you log in?
14:40 the search history
14:40 you could use a tool like piwik to get the searches
14:40 from the urls
14:45 jbeno_away joined #koha
14:50 Mauricio_BR Thank you Cait oleonard kidclamp ;)
15:08 jac joined #koha
15:11 barton good morning #koha!
15:12 fridolin left #koha
15:22 alex_a joined #koha
15:45 rocio joined #koha
15:47 cait left #koha
15:59 JoshB joined #koha
16:24 cait joined #koha
16:39 reiveune bye
16:39 reiveune left #koha
16:59 oleonard I'm surprised Koha doesn't have OpenSearch for the OPAC
17:00 I'm surprised I haven't wished Koha had OpenSearch for the OPAC enough yet to submit a patch.
17:02 rocio joined #koha
17:04 rocio_ joined #koha
17:43 magnuse oleonard: i thought koha had opensearch?
17:43 JoshB joined #koha
17:44 magnuse is it something else that is used to show results from kete in koha and vice versa?
17:44 oleonard It can provide OpenSearch results, if I recall correctly, but it doesn't enable auto-discovery of search for browsers
17:44 magnuse ah
17:44 oleonard So you can't add the OPAC as a search engine in Firefox
17:44 (easily)
17:44 magnuse gotcha!
17:45 patches are welcome ;-)
17:45 oleonard I hope to do so
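For reference, a sketch of the two pieces autodiscovery needs: a link tag in the OPAC page head and an OpenSearch description document. The paths are invented; opac-search.pl and its q parameter are Koha's existing OPAC search entry point.

```python
# Sketch of OPAC OpenSearch autodiscovery: a <link> in the page head
# pointing at a description document. Paths are invented; opac-search.pl
# and its q= parameter are Koha's existing OPAC search entry point.
AUTODISCOVERY_LINK = (
    '<link rel="search" type="application/opensearchdescription+xml" '
    'title="Library catalogue" href="/opensearch.xml">'
)

OPENSEARCH_DESCRIPTION = """<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>Library catalogue</ShortName>
  <Description>Search the library OPAC</Description>
  <Url type="text/html"
       template="https://opac.example.org/cgi-bin/koha/opac-search.pl?q={searchTerms}"/>
</OpenSearchDescription>"""
```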
17:56 mveron joined #koha
17:56 * magnuse will try to squeeze in a signoff
19:17 alexbuckley_ joined #koha
19:20 magnuse joined #koha
19:28 jzairo joined #koha
19:51 CrispyBran joined #koha
19:52 CrispyBran https://bugs.koha-community.or[…]_bug.cgi?id=18450
19:52 huginn` 04Bug 18450: major, P5 - low, ---, koha-bugs, NEW , Renew in header bypasses hold block and renewal limits
19:58 Joubu joined #koha
20:00 Joubu I'd like a native English speaker to take a look at bug 18432
20:00 huginn` 04Bug http://bugs.koha-community.org[…]_bug.cgi?id=18432 trivial, P5 - low, ---, ephetteplace, Signed Off , Most code comments assume male gender
20:00 Joubu to me it does not make sense to make these changes as it is only in code comments
20:02 CrispyBran Joubu: really?  Someone is going to waste programmer time with this?
20:02 Joubu can be replaced by she or whatever
20:02 but "he or she" just makes things heavy to read IMO
20:02 wahanui and ever, amen.
20:02 * Joubu trolls and goes out
20:05 * CrispyBran thinks all references to librarians should be in animal or vegetable form.
20:05 * cait waves
20:06 cait CrispyBran: can you demonstrate this in a sentence? :)
20:06 CrispyBran 'When a carrot manages the suggestion, it can set the status to "REJECTED" or "ACCEPTED".'
20:06 * cait has to admit a 'he' in a comment doesn't stop me
20:06 cait hm stop her
20:06 ?
20:06 lol
20:07 when a notice with the code ACCEPTED is set up, a message will be sent from the kitten to the patron.
20:07 ?
20:08 JesseM joined #koha
20:08 cait hm this vegetable needs to do laundry, brb
20:09 CrispyBran :)
20:12 kholt joined #koha
20:14 magnuse joined #koha
20:16 alexbuckley joined #koha
20:17 CrispyBran I am not sure how a reference to a particular gender as an EXAMPLE proves to be problematic. If a programmer has an issue with this, there are deeper issues that should be addressed, rather than taking offense at an example.
20:19 If we changed the language to female, I seriously doubt we'd lose any of the male programmers.  Anyway, that's all the energy I can contribute to this topic.  Moving on to problematic code.
20:47 rangi just use they
20:48 problem solved
20:48 there is no reason to ever need to use gendered language in an example
20:52 http://timesofindia.indiatimes[…]show/55395836.cms
20:52 it is actually important
20:53 in fact, im gonna put my time where my mouth is and do a follow up doing that
20:56 cait if we want to change it permanently, maybe we should also have a coding guideline
20:56 so people are more aware?
20:56 i think some of the he/she/they problem is non-native speakers not being aware of the neutral forms and how to use them
20:57 i didn't know until not so long ago
20:57 mveron joined #koha
20:59 rangi yeah, māori has no gendered pronouns so it is easy
20:59 and in english, they can be singular or plural, so easy to always use that
21:00 if we get the base neutral, individual communities can decide how to deal with the translations
21:01 Joubu the patch is only about comments, so no translation needed
21:01 apparently the problem does not appear on the interface
21:01 rangi cool, ill update it now(ish)
21:07 Joubu: thanks for the work on the onboarding tool
21:07 and its good to see hea2 live too
21:08 cait Joubu++
21:08 rangi If the user logged in is the SCO user and they try to go out the SCO module, log the user out removing the CGISESSID cookie
21:08 i think that is less clunky than he or she eh?
21:09 ill also fix the english
21:09 to out of the SCO :-)
21:11 Joubu yep, apparently hea will not be backported this month, but should be next month :)
21:11 cait i'd like to get the translators on it too - but will try to push it eraly
21:12 have to figure out the schema changes too i think
21:13 Joubu cait: do not worry about the schema changes, now we have checks to avoid failures on upgrade
21:13 a simple c/p of the DB rev from master should be enough
21:13 cait just something i haven't done so far :)
21:13 oh?
21:13 so i don't need to regenerate?
21:14 alexbuckley Yes thank you for the work on the onboarding tool Joubu
21:14 Joubu ha dbic schema you meant, yes you will have
21:14 thx for the quick signoff alexbuckley :)
21:15 alexbuckley No worries :)
21:15 Joubu just hope you did not signoff because you gave up!
21:16 have to run, see you tomorrow :)
21:16 alexbuckley Nope I did like your changes, better to see fewer screens by having the additional info about how to create things on the same page as the forms, for example
21:19 magnuse joined #koha
21:23 wizzyrea confetti!
21:23 wahanui confetti is, like, http://25.media.tumblr.com/tum[…]1qh8hleo1_400.gif
21:23 wizzyrea more confetti!
21:23 wahanui o/ '`'`'`'`'`'`'`'`'`'`'`'`'`'`'​`'`'`'`'`'`'`'`'`'`'`'`'`'`'
21:24 talljoy joined #koha
21:25 rangi hi talljoy
21:25 talljoy hiya rangi!
21:28 JoshB joined #koha
21:29 JoshB joined #koha
21:35 caboose-afk joined #koha
21:35 caboose-afk joined #koha
22:01 caboose joined #koha
23:09 CrispyBran Joubu: when you create a patch and someone says it doesn't apply, what do I need to do on my end?
23:09 wizzyrea rebase it, probably
23:10 (but it depends on the error message)
23:12 CrispyBran do I just do a git pull on master, test my patch again and then obsolete and attach again?
23:12 wizzyrea that might do it
23:12 you could also check out a clean master, and cherry pick your patch over
23:12 if it works, yay
23:13 reattach
23:13 if not, find out why, and fix that.
23:13 the approach really depends on how it's not applying
23:13 fixing an "insufficient blobs" error is way different from say, a merge conflict
23:14 either way you'll know when you go to bz apply the patch on current master.
23:14 so that's where to start.
23:22 CrispyBran Thanks for the info.
23:22 Have a good one.
23:33 alexbuckley_ joined #koha
23:35 dilan joined #koha
23:40 dilan joined #koha
23:44 talljoy joined #koha
23:47 irma joined #koha
23:58 papa joined #koha
