Semantic Web Interest Group IRC Chat Logs for 2010-02-12

This is an automatically generated IRC chat log made by the Perl IRC logger bot from the Semantic Web Interest Group IRC chat on server irc.freenode.net, channel #swig. Provided by Planet RDF.

See also the Semantic Web Interest Group IRC Scratchpad for the collaboratively written weblog and the ESW wiki.



00:13:13 <dajobe> yeah

00:15:03 * dajobe git commits that to issue list

08:55:40 <mhausenblas> morning Web of Data

08:56:14 <mhausenblas> https://groups.google.com/group/webfinger/browse_thread/thread/fb56537a0ed36964

08:56:15 <dc_swig> A: https://groups.google.com/group/webfinger/browse_thread/thread/fb56537a0ed36964 from mhausenblas

08:56:34 <mhausenblas> A:| webfinger enabled for all gmail accounts with public profiles

08:56:35 <dc_swig> Titled item A.

08:57:00 <mhausenblas> A: using LRDD http://tools.ietf.org/html/draft-hammer-discovery-03

08:57:02 <dc_swig> Added comment A1.

08:57:20 <mhausenblas> A: for example, curl http://www.google.com/s2/webfinger/?q=Michael.Hausenblas@gmail.com

08:57:21 <dc_swig> Added comment A2.
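
A minimal Python sketch of the same lookup done with curl above; the endpoint URL and account are the ones quoted in the chat and may no longer behave as described:

    import urllib.parse
    import urllib.request

    # Webfinger endpoint and account exactly as quoted above (may have changed since 2010)
    endpoint = "http://www.google.com/s2/webfinger/"
    account = "Michael.Hausenblas@gmail.com"

    url = endpoint + "?" + urllib.parse.urlencode({"q": account})
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))   # prints the XRD describing the account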

08:57:47 <mhausenblas> A: how long will it take till we have an RDF view of it, mapped to FOAF, DC, vCard, whatever?

08:57:49 <dc_swig> Added comment A3.

09:01:33 * mhausenblas running now to write an XSLT ... ;)

09:04:07 <mhausenblas> A: related post by Tim Finin http://ebiquity.umbc.edu/blogger/2009/08/15/webfinger-a-finger-protocol-for-the-web/

09:04:08 <dc_swig> Added comment A4.

09:05:15 <mhausenblas> morning, danbri

09:05:24 <mhausenblas> did you see the cool news already?

09:05:44 <mhausenblas> A: and o'course bblfish's kewl blog post re this: http://blogs.sun.com/bblfish/entry/web_finger_proposals_overview

09:05:46 <dc_swig> Added comment A5.

09:05:54 <danbri> news?

09:06:11 <mhausenblas> A:

09:06:20 <dc_swig> https://groups.google.com/group/webfinger/browse_thread/thread/fb56537a0ed36964

09:06:20 <dc_swig> webfinger enabled for all gmail accounts with public profiles

09:06:20 <dc_swig> (1:mhausenblas) using LRDD http://tools.ietf.org/html/draft-hammer-discovery-03

09:06:20 <dc_swig> (2:mhausenblas) for example, curl http://www.google.com/s2/webfinger/?q=Michael.Hausenblas@gmail.com

09:06:20 <dc_swig> (3:mhausenblas) how long will it take till we have an RDF view of it, mapped to FOAF, DC, vCard, whatever?

09:06:20 <dc_swig> (4:mhausenblas) related post by Tim Finin http://ebiquity.umbc.edu/blogger/2009/08/15/webfinger-a-finger-protocol-for-the-web/

09:06:20 <dc_swig> (5:mhausenblas) and o'course bblfish's kewl blog post re this: http://blogs.sun.com/bblfish/entry/web_finger_proposals_overview

09:06:56 <danbri> is the foaf stuff in there still?

09:07:00 <danbri> he was working on it...

09:07:13 <mhausenblas> are you aware of any XSLT that maps the XRD to FOAF, danbri?

09:07:22 <danbri> nope

09:07:31 * mhausenblas now really motivated

09:07:49 * mhausenblas firing up TextMate and hacks something together

09:17:43 <danbri> try rapper 'http://s2.googleusercontent.com/webfinger/?q=danbrickley%40gmail.com&fmt=foaf'

09:17:51 <danbri> what's the official webfinger endpoint now?

09:18:20 <SimpsonTP> any1 going to amsterdam this afternoon ?

09:18:31 <mhausenblas> very cool, danbri, thanks!

09:18:48 <mhausenblas> who is behind this service?

09:18:57 <danbri> it's brad fitzpatrick, the guy who made the main thing

09:19:11 <mhausenblas> wow

09:19:26 <danbri> .g "google webfinger prototype updates"

09:19:26 <phenny> danbri: No results found for '"google webfinger prototype updates"'.

09:19:38 <danbri> http://groups.google.com/group/webfinger/browse_thread/thread/304a4fbc990abb4c

09:19:39 <dc_swig> B: http://groups.google.com/group/webfinger/browse_thread/thread/304a4fbc990abb4c from danbri

09:20:07 <danbri> that curl link you gave points at same thing?

09:20:10 <danbri> <Link rel='describedby' href='http://s2.googleusercontent.com/webfinger/?q=Michael.Hausenblas%40gmail.com&amp;fmt=foaf' type='application/rdf+xml'/>
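
A hypothetical Python sketch of pulling the rel='describedby' FOAF link out of an XRD document like the one pasted above; the namespace-agnostic tag matching is an assumption about the response format:

    import xml.etree.ElementTree as ET

    def describedby_links(xrd_xml):
        """Return (href, type) pairs for every rel='describedby' Link in the XRD."""
        root = ET.fromstring(xrd_xml)
        links = []
        for elem in root.iter():
            # Match <Link> elements regardless of namespace prefix/URI
            if elem.tag.endswith("Link") and elem.get("rel") == "describedby":
                links.append((elem.get("href"), elem.get("type")))
        return links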

09:20:39 <mhausenblas> yeah.

09:20:59 <mhausenblas> maybe, sometimes, I should read a bit more before firing up an editor ;)

09:21:32 <mhausenblas> though, frankly, googleusercontent.com is totally new to me

09:21:55 <mhausenblas> but hang on

09:22:16 <mhausenblas> not sure if http://s2.googleusercontent.com/webfinger/?q=Michael.Hausenblas%40gmail.com&amp;fmt=foaf really is what I am after ...

09:22:43 * danbri mails bradfitz to ask for s/holdsAccount/account/

09:23:02 <danbri> it is a very basic foaf file but could be fleshed out

09:23:35 <mhausenblas> right

09:23:47 <mhausenblas> I thought of having an XRD2RDF, though

09:24:31 <danbri> i think we'll get it from source

09:25:45 <mhausenblas> ?

09:43:58 <gromgull> Does anyone know their way around virtuoso configuration? I've got a local mirror of dbpedia, but loading the HTML views always takes forever, for example: http://pc-4323:8890/about/html/http://dbpedia.org/resource/Cake (obviously won't load for you) takes 30s to load, but the corresponding json or xml query results come instantly

09:46:25 <iv_an_ru> gromgull, we're using (not yet released for public) Virtuoso Cluster Edition. For a single box, the more RAM the better :)

09:47:17 <gromgull> But I do not think the issue is ram... loading the data for a thing is quick enough. The box has 8GB for the virtuoso server...

09:47:35 <gromgull> I wonder if the sponger somehow gets invoked for this about/html/* URI

09:47:58 <danbri> http://www.cl.cam.ac.uk/research/security/banking/nopin/

09:47:59 <dc_swig> C: http://www.cl.cam.ac.uk/research/security/banking/nopin/ from danbri

09:48:00 <gromgull> while waiting the CPU load on the server is almost zero

09:48:17 <danbri> C:|Chip'n'pin oopsie

09:48:18 <dc_swig> Titled item C.

09:48:30 <danbri> (or, why I try to avoid protocol design...)

09:51:44 <iv_an_ru> gromgull, I've forwarded the question to Mitko Iliev; he wrote the final version of the thing, hope he will join the channel

09:53:54 <gromgull> thanks iv_an_ru!

09:54:13 <gromgull> The about/html and about/rdf URIs belong to the rdf_mapping package

09:54:44 <gromgull> it seems to be closely tied to the sponger? I wonder if this is how dbpedia generates their html view...

09:55:29 <imitko> hi all

09:55:56 <gromgull> Hi, imitko

09:56:18 <imitko> gromgull: the dbpedia html pages are not part of sponger, even though they look the same

09:57:18 <gromgull> mitko: so the about/html/* URIs exposed by the rdf_mapper package

09:57:25 <gromgull> are sponger backed?

09:57:43 <gromgull> What do I use if I want an html view of only my local data?

09:59:05 <imitko> yes, about/html* is the sponger service. if you want to show only local data, you may need to tweak the pages from sponger

09:59:41 <gromgull> I have tried deleting all the sponger patterns - hoping it would just use local data then

10:00:18 <besbes_> besbes_ is now known as besbes

10:00:30 <gromgull> "the cartridges" in virtuoso lingo

10:00:55 <gromgull> I see the sparql endpoint has an extra "should_sponge=true|false" parameter

10:00:57 <imitko> as for the html view of local dbpedia data you will need a new vad package, we are uploading it atm

10:01:15 <gromgull> ok

10:08:37 <MichelvT> hi there

10:09:27 <MichelvT> I have a question about SPARQL endpoints...

10:10:18 <MichelvT> I read some literature about them, but I now feel like a SPARQL endpoint can be created to tie all Semantic Web data objects together, regardless of the format... Is this the way I should think about it?

10:14:31 <Wikier> https://sourceforge.net/mailarchive/forum.php?thread_name=1265969210.4731.190.camel%40tejo-portatil&forum_name=sparql-wrapper-devel

10:14:33 <dc_swig> D: https://sourceforge.net/mailarchive/forum.php?thread_name=1265969210.4731.190.camel%40tejo-portatil&forum_name=sparql-wrapper-devel from Wikier

10:14:34 <gromgull> MichelvT: your interpretation is not quite the normal one

10:14:46 <MichelvT> okay

10:14:51 <Wikier> D:|SPARQLWrapper 1.4.1

10:14:53 <dc_swig> Titled item D.

10:14:59 <gromgull> a sparql endpoint is the address of a web-service that adheres to the sparql protocol

10:15:05 <gromgull> it will answer sparql queries about something

10:15:15 <gromgull> in general only the local data stored by this particular server

10:15:16 <gromgull> for instance

10:15:20 <MichelvT> I encountered this FAQ: http://www.thefigtrees.net/lee/sw/sparql-faq

10:15:31 <gromgull> dbpedia.org has wikipedia mapping to rdf - the endpoint at dbpedia.org/sparql

10:15:37 <gromgull> answers sparql queries about this

10:16:03 <gromgull> NOW - there is nothing stopping a sparql endpoint server from going out to the wider web and finding more answers to your query

10:16:08 <gromgull> but i would not say that this is the usual case
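
A small Python sketch of what querying such an endpoint looks like in practice, using the SPARQLWrapper library Wikier announces above (item D); the query anticipates the Albert Einstein example gromgull gives a bit further down:

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://dbpedia.org/sparql")   # the DBpedia endpoint
    sparql.setQuery("""
        SELECT DISTINCT ?concept WHERE {
            ?concept <http://xmlns.com/foaf/0.1/name> "Albert Einstein"
        }
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["concept"]["value"])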

10:16:12 <MichelvT> okay

10:16:46 <MichelvT> I believe that in its most used application there is a triple store *or quad store* behind a SPARQL endpoint ?

10:17:03 <gromgull> exactly

10:17:47 <MichelvT> okay :)

10:17:48 <gromgull> but there are things like squin: http://squin.sourceforge.net/

10:17:52 <Anchakor> gromgull: though coincidentally what MichelvT describes is the way of virtuoso with its sponger

10:18:09 <gromgull> where the data is fetched from the web as a whole, the local triple store is a cache only

10:18:40 <gromgull> Anchakor: and sponger of course, that we just talked about. This goes even one step further, collecting semi-structured information from other formats as well, not just linked data

10:19:39 <mhausenblas> ok, a brain dump re the WebFinger stuff and discovery etc.:

10:19:40 <mhausenblas> http://webofdata.wordpress.com/2010/02/12/google-lod-cloud-contributor/

10:19:42 <dc_swig> E: http://webofdata.wordpress.com/2010/02/12/google-lod-cloud-contributor/ from mhausenblas

10:19:51 <mhausenblas> E:| Is Google a large-scale contributor to the LOD cloud?

10:19:53 <dc_swig> Titled item E.

10:23:48 <MichelvT> I was trying to make a distinction of how RDF data can be stored and ultimately accessed by a Semantic Web browser

10:24:07 <MichelvT> and thus I came to the conclusion that the SPARQL endpoint is an important peer in this architecture

10:24:36 <MichelvT> ;)

10:25:15 <MichelvT> but I'm still not sure how a SPARQL endpoint is 'defined', as it is rather abstractly defined by W3C

10:26:05 <Anchakor> mhausenblas: though the FOAF data in rdfxml on http://s2.googleusercontent.com/webfinger/?q=Michael.Hausenblas%40gmail.com&fmt=foaf doesn't parse

10:26:35 <Anchakor> in rapper

10:26:36 <gromgull> MichelvT: It's defined as a URI that speaks the SPARQL HTTP protocol?

10:27:02 <MichelvT> ah okay! thanks gromgull

10:27:07 <MichelvT> it's just that

10:27:20 <gromgull> I.e. go here: http://dbpedia.org/sparql , copy paste in select distinct ?Concept where {?Concept <http://xmlns.com/foaf/0.1/name> "Albert Einstein" }

10:27:43 <gromgull> click query and you see the http uri that gives you the answers to "who has the name 'Albert Einstein'" ?

10:27:57 <gromgull> This beautiful uri: http://dbpedia.org/sparql?default-graph-uri=http%3A%2F%2Fdbpedia.org&should-sponge=&query=select+distinct+%3FConcept+where+{%3FConcept+%3Chttp%3A%2F%2Fxmlns.com%2Ffoaf%2F0.1%2Fname%3E+%22Albert+Einstein%22+}&format=text%2Fhtml&debug=on&timeout=

10:28:16 <gromgull> you can change format=text/html to format=json for something that a program can parse easily
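
A sketch of that idea in Python: the SPARQL protocol request is just an HTTP GET with the query (plus Virtuoso's extra parameters) URL-encoded; the parameter names are the ones in the URI gromgull pasted:

    import urllib.parse
    import urllib.request

    query = """SELECT DISTINCT ?Concept WHERE {
      ?Concept <http://xmlns.com/foaf/0.1/name> "Albert Einstein"
    }"""

    params = urllib.parse.urlencode({
        "default-graph-uri": "http://dbpedia.org",
        "query": query,
        "format": "json",   # instead of text/html, as suggested above
    })

    with urllib.request.urlopen("http://dbpedia.org/sparql?" + params) as response:
        print(response.read().decode("utf-8"))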

10:28:58 <MichelvT> okay, so in theory every SPARQL query can be assigned a URI too

10:29:20 <mhausenblas> Anchakor, what do you mean with doesn't parse?

10:29:58 <gromgull> MichelvT: i guess so - although they are not really unique... i can express the same query using different triple ordering, different variable names, etc.

10:30:10 <gromgull> the important part is that the HTTP GET on this returns the result

10:30:49 <MichelvT> okay

10:31:51 <imitko> gromgull: you can try the latest dbpedia_dav.vad available at http://bit.ly/aMscZI, it has fixes for the speed issue

10:32:54 <mhausenblas> Just loaded http://s2.googleusercontent.com/webfinger/?q=Michael.Hausenblas%40gmail.com&fmt=foaf into a local ARC2 store and queried it (as sparql.org is down, again)

10:32:58 <mhausenblas> works perfectly fine

10:33:17 <MichelvT> mhausenblas: I wouldn't understand why it should not parse

10:33:26 <MichelvT> it is perfectly valid as far as I can see

10:33:30 <mhausenblas> yes

10:33:49 <mhausenblas> that's why I asked Anchakor what he means by it ... ;)

10:33:53 <MichelvT> :)

10:34:04 <mhausenblas> maybe a SNAFU in my blog post?

10:34:36 <gromgull> loads fine into rdflib
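
gromgull's rdflib check, as a minimal sketch (the URL is the one discussed above; the FOAF file it serves may have changed since):

    import rdflib

    url = ("http://s2.googleusercontent.com/webfinger/"
           "?q=Michael.Hausenblas%40gmail.com&fmt=foaf")

    graph = rdflib.Graph()
    graph.parse(url, format="xml")   # the endpoint serves application/rdf+xml

    print(len(graph), "triples")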

10:51:55 <MichelvT> I have tried to identify some containers of RDF data

10:52:13 <MichelvT> such as RDF documents (.xml, .n3, .ttl, .nt)

10:52:27 <MichelvT> and such as semantically enhanced HTML files

10:52:38 <MichelvT> but I cannot really relate a SPARQL endpoint with it

10:52:48 <MichelvT> should I see it as some sort of wrapper ? :P

10:53:21 <shellac_> you mean you want to query the data using sparql?

10:54:51 <shellac_> you can use a number of online services to do that. many will load data in FROM or FROM NAMED clauses

10:55:00 <MichelvT> okay... :P

10:55:06 <MichelvT> but just to get it clear

10:55:13 <MichelvT> SPARQL itself does not contain data right?

10:55:22 <MichelvT> then I do understand what it is all about :)

10:55:24 <MichelvT> well

10:55:28 <MichelvT> at least some of it :P

10:55:35 <gromgull> SPARQL is just the query language - like SQL for a relational database

10:55:47 <MichelvT> but the P stands for Protocol

10:55:47 <gromgull> rdfxml, n3, etc. are RDF interchange formats

10:56:00 <MichelvT> and that is the HTTP protocol in this case ?

10:56:09 <gromgull> yes, it is also the HTTP protocol and results format for SPARQL endpoints

10:56:15 <MichelvT> okay

10:56:39 <shellac_> http://www.w3.org/TR/rdf-sparql-protocol/ -- http://..../endpoint?query=...

10:56:57 <gromgull> serving RDFXML or N3 data using HTTP has nothing to do with SPARQL

10:57:04 <gromgull> this is Linked data
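
To make gromgull's distinction concrete, a Python/rdflib sketch of the linked data side: dereferencing an RDF document over HTTP involves no SPARQL at all, just fetching and parsing. The example resource is an assumption; rdflib sends RDF Accept headers, so DBpedia's redirect should land on the RDF description:

    import rdflib

    graph = rdflib.Graph()
    # Plain HTTP dereferencing of a resource URI; content negotiation (not SPARQL)
    # decides that we get back an RDF description of the resource.
    graph.parse("http://dbpedia.org/resource/Albert_Einstein")

    print(len(graph), "triples retrieved by following the URI")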

10:57:59 <MichelvT> I understand

10:58:07 <shellac_> the sparql graph pattern does look like rdf data, but with gaps where you want to find something

10:58:15 <MichelvT> SPARQL is 'just' one way of browsing linked data then

10:58:24 <MichelvT> and URI dereferencing is another one

10:59:27 <shellac_> I'd only call the latter browsing.

11:00:38 <shellac_> but I guess you could argue the former (for those sparql services which load data on the fly)

11:01:02 <gromgull> iv_an_ru: imitko's dbpedia vad works wonders! Please thank him for me!

11:02:35 <MichelvT> okay thank you shellac_ and gromgull

11:02:35 <MichelvT> :)

11:04:42 <fidothe_> fidothe_ is now known as fidothe

11:05:08 <MichelvT> I will still have to rethink how a Semantic Web browser should then use these kinds of communication then

11:05:16 <MichelvT> but you got me thinking :)

11:05:56 <Anchakor> mhausenblas: means rapper chokes on it. I guess rapper bug then

11:06:02 <mhausenblas> yup

11:08:05 <iv_an_ru> gromgull, I've relayed it to Mitko, he is fighting now with his chat client that upgraded itself and died :)

12:17:56 <danbri> http://twitter.com/#search?q=%23dswm

12:17:58 <dc_swig> F: http://twitter.com/#search?q=%23dswm from danbri

12:20:35 <libby> danbri http://search.twitter.com/search?q=%23dswm is a better link as you have to be logged in for that one

12:22:07 <shellac_> that works for me now

12:22:20 <shellac_> I mean dan's original

12:22:26 <shellac_> it didn't used to

12:22:57 <danbri> ah thx

12:24:33 <libby> ah k

12:29:16 <shellac_> I bet the Department of Solid Waste Management aren't happy with that tag ;-)

13:04:50 <danbri> who else here is at the dutch semweb meetup right now?

13:10:41 <ghard> me

13:11:10 * enjayhch has just released RedStore version 0.1

13:11:12 <enjayhch> http://code.google.com/p/redstore/

13:11:13 <dc_swig> G: http://code.google.com/p/redstore/ from enjayhch

13:11:31 <enjayhch> RedStore is a lightweight RDF triplestore written in C using the Redland library.

13:12:26 <libby> G:|RedStore is a lightweight RDF triplestore written in C using the Redland library.

13:12:28 <dc_swig> Titled item G.

13:13:02 <mhausenblas> enjayhch++

13:13:05 <mhausenblas> very cool

13:13:17 <mischat> oooo

13:13:20 <mischat> nice one nick

13:13:22 <gromgull> Hmm - sesame cannot read any of dbpedia sparql output formats...

13:13:44 <enjayhch> mhausenblas: it is very alpha - needs some work still

13:13:55 <enjayhch> but thought I should get something out the door

13:14:12 <mhausenblas> did anyone test it under MacOS, yet, enjayhch ?

13:14:27 <enjayhch> mhausenblas: I wrote it under Mac OS :)

13:14:38 <mhausenblas> very very cool!

13:14:57 <enjayhch> I am going to distribute a static Mac binary...

13:15:01 * mhausenblas goes svn co ...

13:15:09 <mhausenblas> great

13:15:24 <danbri> ghard, heh yep, located you already :)

13:17:09 <mischat> gromgull: the dbpedia sparql-result format parses ok for me

13:18:04 <gromgull> mischat: I tried to parse the rdf/xml results - then sesame complains about rdf:nodeID for each solution

13:18:08 <gromgull> or the ntriples

13:18:16 <gromgull> then it complains about the _:_ bnode IDs

13:18:42 <gromgull> I guess I could parse the sparql XML format...

13:18:55 <gromgull> but then things get complicated, since I use rdf2go on top of sesame, but that is my own fault.

13:19:15 <mischat> mmm, i use this command line tool to parse arbitrary sparql endpoint

13:19:16 <mischat> github.com/tialaramex/sparql-query/

13:19:27 <mischat> sure

13:19:29 <shellac_> gromgull: do you have an example?

13:19:45 * danbri goes into powersaving mode

13:19:52 <mischat> enjayhch: am going to have a play with your RedStore soon, exciting stuff

13:20:00 <shellac_> so you think it's the form of the bNode id

13:20:37 <enjayhch> mischat: :)

13:20:46 <gromgull> mischat: but this is not using sesame?

13:21:01 <gromgull> shellac_: http://dbpedia.org/sparql?default-graph-uri=http%3A%2F%2Fdbpedia.org&should-sponge=&format=text%2Fplain&debug=on&query=SELECT+%3Fx+WHERE+{+%3Chttp%3A%2F%2Fdbpedia.org%2Fresource%2FAAA%3E+%3Chttp%3A%2F%2Fdbpedia.org%2Fproperty%2Fabstract%3E+%3Fx+}

13:21:23 <gromgull> here the bnode id for the result-set is _:_ - it may well be legal ntriples... but sesame won't parse it

13:22:07 <shellac_> I don't think it is legal

13:22:21 <shellac_> http://www.w3.org/2001/sw/RDFCore/ntriples/#name

13:22:54 <gromgull> indeed - so it seems

13:23:01 <gromgull> then virtuoso produces illegal ntriples

13:23:03 <mischat> nope it isn't

13:23:09 <gromgull> oh well

13:23:16 <shellac_> so +1 sesame, but oh dear

13:23:32 <mischat> gromgull: not using sesame, was just confirming that the output of dbpedia's sparql-result XML seems sane

13:24:28 <gromgull> shellac_: alternatively, here with rdf/xml http://dbpedia.org/sparql?default-graph-uri=http%3A%2F%2Fdbpedia.org&should-sponge=&format=application%2Frdf%2Bxml&debug=on&query=SELECT+%3Fx+WHERE+{+%3Chttp%3A%2F%2Fdbpedia.org%2Fresource%2FAAA%3E+%3Chttp%3A%2F%2Fdbpedia.org%2Fproperty%2Fabstract%3E+%3Fx+}

13:24:42 <gromgull> here the res:solution rdf:nodeID="blah" gives problems

13:24:59 <gromgull> I'm happy to see that it's so long since I had to deal with rdf/xml that I have no idea if this is ok

13:25:12 <shellac_> ok, now I scratch my head because that looks fine

13:25:44 <shellac_> hang on, this is sparql result set in rdf?

13:25:59 <gromgull> yes

13:26:09 <shellac_> why not use the normal result set format?

13:27:08 <gromgull> this is my own fault - because I did not have a parser for this handy - I wanted to just reuse the rdf/xml/ntriples parser I had

13:27:17 <gromgull> but this is because of our weird rdf2go on top of sesame setup

13:27:20 <shellac_> does sesame still object to that rdf?

13:27:25 <gromgull> yes

13:27:57 <gromgull> Caused by: org.openrdf.rio.RDFParseException: unexpected attribute 'rdf:nodeID' [line 6, column 8]

13:28:33 <shellac_> oh yes, that rdf/xml is completely borked

13:29:06 <gromgull> yes, I don't think you can nest more properties inside something that has a nodeID - it's like a bastard child of nodeID and parseType=Resource

13:29:08 <shellac_> I was looking for the bnode problem again, but this is a striping problem

13:29:40 <shellac_> who wrote this result serializer, and has anyone ever used it successfully?

13:30:06 <gromgull> again it's from virtuoso

13:31:08 <shellac_> are you looking for a hacky work around?

13:31:29 <shellac_> if so I'd filter the ntriples to change _:_ to _:U

13:32:20 <gromgull> i dont know who wrote it ...

13:32:21 * gromgull goes back to sesame's SparqlXMLResultParser

13:32:42 <gromgull> hmm ... I like that solution :)
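
shellac_'s hacky workaround, sketched in Python (the replacement label '_:U' is the one he suggests; any legal N-Triples bnode label would do):

    import re
    import sys

    def fix_bnodes(ntriples_text):
        """Rewrite Virtuoso's illegal '_:_' blank node labels into legal ones."""
        return re.sub(r"_:_(?=\s)", "_:U", ntriples_text)

    if __name__ == "__main__":
        # Filter N-Triples on stdin before handing them to Sesame's parser
        sys.stdout.write(fix_bnodes(sys.stdin.read()))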

13:34:26 <shellac_> let the dbpedia / virtuoso people know.

13:34:49 <ghard> We do already ;)

13:34:59 <ghard> I'm looking at it.

13:36:13 <ghard> There's some question about whether it is correct to return RDF at all for SELECT

13:38:21 <shellac_> well I think it's ok in principle, but it seems like it could well lead to more confusion than anything else

13:38:40 <gromgull> is the rdf resultset mapping not defined by some sparql doc?

13:39:44 <mhausenblas> enjayhch, tiny nit, but it did cost me, as an inexperienced guy, some 15min just now ...

13:39:54 <mhausenblas> you might want to add a line "run autogen.sh" first

13:40:06 <mhausenblas> before "./configure"

13:40:20 <mhausenblas> on the google code page at http://code.google.com/p/redstore/

13:40:49 <mhausenblas> I know, I know, this is likely a no-brainer for experts/hackers, but ...

13:41:06 <ghard> Was trying to look for it. Not 100% sure about what happens with the format as URI parm, but SELECT with Accept: application/rdf+xml, application/sparql-results+xml should always return a vanilla SPARQL result set.

13:41:16 <shellac_> could someone check that the buttons in http://rdf-in-html.appspot.com/ are working in their browser?

13:41:38 <ghard> Not in older builds though.

13:41:49 <ghard> This was changed a while ago.

13:42:31 <mhausenblas> shellac: seems to work (FF3.5 under MacOS)

13:42:47 <mhausenblas> the parse btn in the "Parsing a web page" triggers a 500

13:43:15 <mhausenblas> the parse btn in the "Parsing content directly" section creates something in the textarea next to it ...

13:43:16 <mhausenblas> hth

13:43:33 <shellac_> thanks. yep, not concerned about that (it is the right answer). the POST seemed to be taking a long time

13:43:41 <mhausenblas> oki

13:43:54 <shellac_> but maybe that was just first deploy jitters

13:43:57 <shellac_> thanks mhausenblas

13:44:27 <gromgull> ghard, shellac_: the ntriples is even worse, it also uses prefixes, rdf:type, i.e. not the full URI

13:44:49 <gromgull> but I guess this must be only for the query serialisation? While exporting ntriples normally virtuoso is not so broken I guess


13:45:37 <mischat> agree the ntriples is way dirty

13:45:38 <shellac_> yes, I think this was ghard's point.

13:45:48 <shellac_> not a well-trodden path at all

13:47:54 <enjayhch> mhausenblas: you only have to run ./autogen.sh if checking out from subversion. I need to write some instructions for compiling from Subversion then

13:47:55 <ghard> Yep. Will check with others in the company re. the way BNODE ids are generated.

13:48:54 <ghard> Gotta move.

14:02:18 <mhausenblas> yup, enjayhch that's what I did ;)

14:07:05 <mhausenblas> hm, seems I need glibtoolize ... how do I get this on MacOS?

14:07:34 <mhausenblas> enjayhch, I need to have the Redland storage installed, right?

14:07:48 <mhausenblas> (in order to build redstore, I mean ;)

14:08:28 <ghard> Re. the SPARQL results set format. The behaviour was modified, as I was testing one of our clients against UK govt data endpoints.

14:09:17 <mhausenblas> anyone? how the heck do I get the damn glibtoolize ...

14:10:09 <mischat> do you have a compiler on your mac mhausenblas ?

14:10:19 <mischat> it should come with Xcode Tools

14:10:34 <mischat> i have it on mine ;)

14:12:40 <mhausenblas> hm, I have Xcode installed

14:13:32 <mhausenblas> but Redland seems to expect it and I do need Redland for redstore, right?

14:19:15 <mischat> ah

14:19:25 <mischat> i just ran `which glibtoolize`

14:19:33 <mischat> and it seems that I installed it via fink

14:19:50 <mischat> fink install libtool

14:20:03 <mischat> or if you would rather port

14:20:12 <mischat> port install libtool probably works too

14:34:29 <mhausenblas> right, thanks mischat! I found http://redland.darwinports.com/ now ... tried this in the mean time

14:36:08 <iv_an_ru> shellac_, gromgull, OK I'll make it _:supposedlyUniqueIdentifierThatNotConflictWithAnyPersistentBnodeThatCanBeMadeByAndysSoft :)

14:36:52 <gromgull> :)

14:37:44 <shellac_> people who ask for rdf from SELECT queries deserve such bNodes :-)

14:40:16 <enjayhch> mhausenblas: yeah, you need Redland installed

14:42:00 * enjayhch uses fink

14:42:24 <mhausenblas> ok

14:42:26 <mhausenblas> thanks

14:49:36 <ktkNA> ktkNA is now known as ktk

15:01:36 <scor> mhausenblas: I remember issues with glibtoolize when trying to compile redland tools some time ago.. I could not find a solution. I ended up having to use a pre-compiled package (as opposed to checking out from svn)

15:06:04 <dajobe> fink or macports - both work. redland libraries are built on osx all the time. compiling from tarballs will save you lots of hassle

15:29:05 <iv_an_ru> gromgull, _:_ is tweaked in CVS, will be visible to public on dbpedia.org soon.

15:29:30 <gromgull> thanks iv_an_ru!

15:29:42 <gromgull> meanwhile I changed to using the proper xml result format

15:29:49 <gromgull> but it's still nice ot have it fixed :)

16:15:43 <mhausenblas> thanks both scor and dajobe ;)

16:19:27 <jmv> Hi is there a translator from simple Prolog or Datalog to N3 logic or SWRL ?

16:20:06 <yvesr> jmv: i wrote the inverse :) a n3 to prolog translator

16:20:59 <jmv> Yes , I use another one, Euler all the time

16:21:10 <yvesr> oh, ok

16:21:19 <jmv> http://eulersharp.sourceforge.net/

16:21:21 <dc_swig> H: http://eulersharp.sourceforge.net/ from jmv

16:21:24 <jmv> where is yours ?

16:21:40 <yvesr> http://code.google.com/p/km-rdf/

16:21:42 <dc_swig> I: http://code.google.com/p/km-rdf/ from yvesr

16:21:46 <yvesr> (Henry)

16:23:24 <jmv> in Euler , Jos has also written a Prolog parser for N3 + rules

16:23:42 <yvesr> yup, there is a dcg in henry too

16:24:00 <yvesr> i should look more at euler

16:24:28 <yvesr> the dcg is htere: http://code.google.com/p/km-rdf/source/browse/trunk/n3/n3_dcg.pl

16:24:28 <jmv> good to know

16:27:17 <Anchakor> mhausenblas: actually it wasn't rapper's bug, but mine: forgetting ' around the URI, so & was interpreted by the shell and it tried to parse the XRD version with the rdf/xml parser

16:27:36 <jmv> yvesr, the dcg , and all the rule engine in 1 file : http://eulersharp.svn.sourceforge.net/viewvc/eulersharp/trunk/2006/02swap/euler.yap

16:27:57 <jmv> Although I can write some Prolog, Jos de Roo is the guru here

16:28:17 <yvesr> i remember why i never really looked into it - i can't understand anything :-)

16:28:35 <jmv> There is also a SWI version but less uptodate AFAIK

16:28:51 <yvesr> mine is fairly heavily relying on swi

16:28:52 <jmv> me too

16:29:03 <jmv> not enough comments

16:29:19 <jmv> but the dcg is well separated

16:29:26 <yvesr> the skolemisation is the hardest bit, i thought, when writing a n3 engine in prolog

16:30:12 <jmv> I made a small IDE in Java, from which one can run several N3 logic engines:

16:30:45 <jmv> Euler, FuXi, CWM, and my child based on Drools, a RETE engine

16:30:55 <jmv> it's called EulerGUI

16:34:06 <jmv> yvesr, Euler is using a novel logic form, called Coherent Logic, that avoids the need of skolemisation

16:35:15 <jmv> yvesr, I'll ask my original question on the prolog channel ...

16:45:18 <jmv> yvesr, I think I could use your Prolog dcg backwards ?

16:45:41 <jmv> any DCG is usable either way

17:07:01 <mhausenblas> yey! redstore runs now!

17:07:13 <mhausenblas> Anchakor, all forgotten and forgiven :D

17:07:46 <mhausenblas> thanks again for the help here - using sudo port install redland saved my day ;)

17:13:19 <mhausenblas> enjayhch still around?

17:14:09 <mhausenblas> I have redstore up and running now, however http://localhost:9999/query gives me a:

17:14:11 <mhausenblas> XML Parsing Error: not well-formed

17:14:11 <mhausenblas> Location: http://localhost:9999/query

17:14:11 <mhausenblas> Line Number 12, Column 19:

17:14:17 <mhausenblas> any idea why?

17:15:36 <jmv> yvesr, I ask for help on a private window

17:15:37 <mischat> perhaps he is at the yorkshire grey already :)

17:15:50 <mhausenblas> same issue when I load data into the store

17:16:27 <mhausenblas> mischat, you're familiar with redstore to this extent?

17:16:33 <mischat> nope

17:16:38 <mischat> nick released it today :)

17:16:49 <mhausenblas> hehe, good answer ;)

17:17:04 <mhausenblas> ok, gotta run soon, hope he gets the messages l8er

17:17:30 <mhausenblas> I loaded http://events.linkeddata.org/ldow2010/ into redstore via http://localhost:9999/load

17:17:40 <mhausenblas> then visited http://localhost:9999/data

17:17:44 <mhausenblas> result is:

17:17:50 <mhausenblas> XML Parsing Error: mismatched tag. Expected: </a>.

17:17:51 <mhausenblas> Location: http://localhost:9999/data

17:17:51 <mhausenblas> Line Number 11, Column 114:

17:18:07 <gol> serve it as text/html

17:18:17 <gol> to disable shelly-beige thing

17:18:22 <mhausenblas> phenny, tell enjayhch see http://chatlogs.planetrdf.com/swig/2010-02-12.html#T17-13-19

17:18:22 <phenny> mhausenblas: I'll pass that on when enjayhch is around.

17:18:47 <mhausenblas> gol, yes, but I don't want to tinker around with redstore, I want to use it ;)

17:23:22 <enjayhch> mhausenblas: back

17:23:22 <phenny> enjayhch: 17:18Z <mhausenblas> tell enjayhch see http://chatlogs.planetrdf.com/swig/2010-02-12.html#T17-13-19

17:23:50 <mhausenblas> heya, cool

17:23:57 <mhausenblas> any idea? any patches? :)

17:24:03 <enjayhch> not gone to Yorkshire Grey yet

17:24:19 <mischat> :)

17:24:34 * mhausenblas wondering what Yorkshire Grey is :P

17:24:59 <enjayhch> Yorkshire Grey is a pub close to the BBC ;-)

17:25:26 <enjayhch> so you are trying to load "http://events.linkeddata.org/ldow2010/" as RDF?

17:25:37 <enjayhch> oh it is RDFa

17:25:54 <enjayhch> aha

17:26:03 <mhausenblas> well, yeah

17:26:05 <enjayhch> so rapper "http://events.linkeddata.org/ldow2010/" has errors too

17:26:17 <mhausenblas> but also some RDF/XML

17:26:18 <enjayhch> looks like you have to manually tell it that it is RDFa

17:26:22 <mhausenblas> same result

17:26:28 <mhausenblas> re the /query page

17:26:57 <enjayhch> when I run rapper "http://events.linkeddata.org/ldow2010/"

17:27:02 <enjayhch> it only finds two triples

17:27:36 <mhausenblas> ok, let's forget about RDFa for the moment ;)

17:28:46 <mhausenblas> I'm using now http://sw.deri.org/~aidanh/foaf/foaf.rdf

17:28:51 <mhausenblas> in http://localhost:9999/load

17:29:04 <mhausenblas> loads 136 triples, ok?

17:29:17 <mhausenblas> then I go to http://localhost:9999/query

17:29:20 <Anchakor> yeah rapper expects data to be rdf/xml unless stated otherwise

17:29:33 <mhausenblas> and ... again ...

17:29:33 <mhausenblas> XML Parsing Error: not well-formed

17:29:34 <mhausenblas> Location: http://localhost:9999/query

17:29:34 <mhausenblas> Line Number 12, Column 19:PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>

17:29:34 <mhausenblas> ------------------^

17:29:39 <enjayhch> yup, got 136 triples

17:30:07 <enjayhch> eeek

17:30:12 <mhausenblas> and when you go to http://localhost:9999/query ... (FF3.5 on MacOS)

17:30:15 <mhausenblas> ah!

17:30:16 <mhausenblas> ;)

17:30:33 <mhausenblas> then visit http://localhost:9999/data

17:30:34 <enjayhch> yeah, it is invalid HTML

17:30:38 <mhausenblas> same there

17:30:40 <enjayhch> works in Safari

17:30:40 <mhausenblas> right

17:30:51 <enjayhch> will fix that ASAP!

17:30:53 <mhausenblas> ah, gotta check now in Chrome and Safari

17:30:54 <mhausenblas> thanks!

17:31:10 <mhausenblas> same in Chrome

17:31:23 <mhausenblas> yup, works in Safari

17:31:39 <enjayhch> was going to add XML/HTML validation to the test suite last night

17:31:40 <mhausenblas> enjayhch, you have been testing in Safari (only), right? :P

17:31:45 <enjayhch> but went to bed instead

17:31:52 <mhausenblas> but beside this: great stuff

17:31:57 <enjayhch> mhausenblas: you guess correctly :-P

17:32:09 <mhausenblas> as soon as this is fixed I can advise our project partners to use it

17:32:20 <enjayhch> crikey!

17:32:31 <mhausenblas> they were waiting for exactly this and otherwise I'd need to look into this :D

17:32:38 <enjayhch> :)

17:32:46 <mhausenblas> so, I guess I owe you a beer? or two?

17:32:55 <mhausenblas> next time I'm in London, promise ;)

17:33:40 <mhausenblas> ok, gotta run now - will have an eye on the svn and svn up when I see something

17:33:56 <mhausenblas> quick (last) question, enjayhch

17:33:56 * enjayhch adds HTML escaping to his todo list

17:34:20 <mhausenblas> after I've done svn up, I do have to make/make install again, right?

17:34:43 * mhausenblas *is* a n00b, told ya

17:34:45 <enjayhch> yeah

17:34:56 <enjayhch> or ./src/redstore if you don't want to install

17:35:05 <mhausenblas> yeah, I'm a n00b or yeah make/make install ? :)

17:35:21 <mhausenblas> nah, that's fine, prefer the source from svn

17:35:38 <enjayhch> I run "make && ./src/redstore" when I am developing

17:36:12 <mhausenblas> right

17:36:58 <mhausenblas> ok, thanks a lot again -TTYL

17:39:28 <enjayhch> oh dear, lots of HTML problems

18:03:56 <melvster> mhausenblas: great blog post

18:04:10 <melvster> id say yes, google are becoming part of the LOD

18:05:09 <melvster> format could be tweaked sure, but it's more the statement of intent, once you open up and put the data out there it can be improved

18:24:13 <dajobe> enjayhch: I have 'check' installed (on debian) but make check for redstore fails with: /bin/sh: checkmk: command not found

19:02:22 <kennyluck>http://richard.cyganiak.de/2007/10/lod/

19:02:24 <dc_swig> J: http://richard.cyganiak.de/2007/10/lod/ from kennyluck

19:03:36 <kennyluck> J:| LOD Cloud Diagram No Longer Maintained?

19:03:38 <dc_swig> Titled item J.

19:04:01 <kennyluck> J: Last updated: 2009-07-14

19:04:02 <dc_swig> Added comment J1.

19:04:29 <kennyluck> J: for example, NYTimes is not in the diagram

19:04:31 <dc_swig> Added comment J2.

19:05:07 <kennyluck> J: kind of sad...I like this diagram so much...

19:05:09 <dc_swig> Added comment J3.

19:06:59 <gol> golllllllllllll (Larry beat Alinghi) ORacle ftw

19:07:06 <gol> gol is now known as hock

19:36:45 <Shepard`> Shepard` is now known as Shepard

19:55:38 <kasei> kennyluck, I have no idea, but would guess it's simply a matter of the LOD diagram getting harder and harder to produce as the diagram gets bigger.

20:00:02 <Anchakor> there should be a way to generate it automatically, or it isn't really "linked data" diagram :)

20:01:34 <kasei> the problem is that the automatically generated ones look awful.

20:02:43 <Anchakor> is it really just a layout problem?

20:07:53 <kasei> i don't know. but i seem to recall some early auto-generated diagrams looking really bad. so cygri gets lots of kudos for his work, even if it's not always current.

20:08:56 <dajobe> computers have no sense of taste or aesthetics. see also, search results

20:09:38 <Anchakor> I wonder if all datasets there use voiD and link to it from the data

20:09:57 <kasei> very unlikely

20:10:57 <Anchakor> data where I can't use follow-my-nose method aren't really useful

20:16:12 <hock> how about charts like this

20:16:14 <hock> http://s3.blog.darkhax.com/uploads/2010/01/bad-middleware.png

20:49:46 <ktk> ktk is now known as ktkNA

22:20:14 <DanC> oh yeah... coherent logic... thanks for the reminder, jmv

22:21:42 <DanC> folk.uio.no/rantonse/acl/acl.pdf -

22:24:00 <DanC> sigh... "An upcoming application area is the semantic web [1], with its vision that future web content will be marked up with semantical information."

22:24:23 <DanC> it's more about putting databases and spreadsheets into the web than about marking up web content

22:24:34 <DanC> well, maybe a mix

22:32:26 <DanC> darn; this is a research plan, not a result

22:32:51 <jmv> well , I don't know

22:33:22 <jmv> Marc Bezem is active these days on the SWI Prolog list, I can ask.

22:33:52 <jmv> anyway Jos took some, if not most, of the concepts

22:34:12 <DanC> I tried reading Jos's code... got lost pretty quickly

22:34:37 <DanC> Jos has tried to explain Euler to me several times... I just don't seem to get it.

22:35:15 <jmv> yes, me too; the project in "Automating Coherent Logic" is about doing it in C++ for even more speed, but it's already done in Prolog

22:35:38 * DanC finds a CL prover in prolog http://www.cs.vu.nl/~diem/research/ht/#tool

22:35:42 <jmv> I like him , but he's laconic

22:35:53 <DanC> which him?

22:36:03 <jmv> Jos

22:36:29 * DanC looks up laconic

22:37:02 <jmv> The adj laconic has 1 sense (no senses from tagged texts)


22:37:02 <jmv> 1. crisp, curt, laconic, terse -- (brief and to the point; effectively cut short; "a crisp retort"; "a response so curt as to be almost rude"; "the laconic reply; `yes'"; "short and terse and easy to understand")

22:37:09 <jmv> from WordNet

22:38:46 <dmiles_afk> one of the best prolog theorem provers i have found is called xray: http://www.cs.uni-potsdam.de/wv/xray/

22:38:58 <DanC> http://lists.w3.org/Archives/Public/public-cwm-talk/2009JanMar/0006.html

22:38:59 <dc_swig> K: http://lists.w3.org/Archives/Public/public-cwm-talk/2009JanMar/0006.html from DanC

22:39:04 * dmiles_afk examines CL.pl though

22:39:11 <DanC> K:|Re: Optimizing for recursive rules

22:39:13 <dc_swig> Titled item K.

22:39:18 <DanC> K:discussion of coherent logic

22:39:20 <dc_swig> Added comment K1.

22:39:25 <DanC> best in what way, dmiles_afk ?

22:39:55 <DanC> CL.pl looks like line noise to me. sigh.

22:40:00 <dmiles_afk> as far as soundness.. it cuts off negation by failure paths.. one has to use explicit negation when they think they want negation

22:40:09 <DanC> I could perhaps read it if I tried really hard... but...

22:40:42 <dmiles_afk> when i read CL.pl .. it looks like a file convertor

22:40:46 <jmv> one must first read the above PDF ...

22:40:54 <DanC> ... as sussman says, the point of computer programs should be primarily to communicate with other programmers, and only secondarily to communicate with the machine

22:41:04 <jmv> which I never had the time .

22:41:17 <DanC> which pdf?

22:41:18 <jmv> sure

22:41:30 <jmv> the one you pasted

22:41:39 <jmv> "Research Proposal: Automating Coherent Logic" by Marc Bezem, Thierry Coquand, Arild Waaler

22:41:52 <DanC> but... as I said, that's a plan, not a result.

22:42:02 <DanC> it didn't even have a clear explanation of CL

22:42:07 <jmv> ha sorry , there is another longer ...

22:42:16 <dmiles_afk> DanC: well when i say best .. as in i got probably the best speed of soundness/completeness

22:42:20 <jmv> 1mn

22:42:34 <dmiles_afk> but i only audited about 3 of them and that was in 2002

22:42:53 <dmiles_afk> i really should have written something up :(

22:43:05 <DanC> soundness... in a prover... er... I'd expect 100% soundness or the thing is a complete waste.

22:43:44 <DanC> incompleteness is tolerable; unsoundness is not. otherwise it a guesser, not a prover ;-)

22:43:55 <dmiles_afk> heh.. yeah that is true ;) but often people prefer high hits low semantics.. instead of high semantics and low hits

22:44:15 <DanC> yeah... guessers have their place (to wit: google)

22:44:39 <dmiles_afk> yeah thats a good example of what i meant

22:45:18 <dmiles_afk> well in xray it has more potential than others for - truth maintenance before assertion

22:45:50 <dmiles_afk> meaning before a user can add to it.. it can go through a vetting system like cyc's

22:46:12 <dmiles_afk> but more importantly a consequent of any rule .. must go through a similar vetting process

22:46:31 <jmv> another longer report by Bezem on Coherent logic (look at the end ) http://folli.loria.fr/cds/2006/courses/Bezem.Nivelle.IntroductionToAutomatedReasoning.pdf

22:46:54 <dmiles_afk> even though the rule was valid.. later on data might come in that makes the consequent of that rule not useable

22:47:51 <dmiles_afk> usually argument domain/range assertions help.. but they are not always enough

22:49:20 <DanC> I don't see Coherent Logic in there, jmv

22:49:23 <DanC> which section?

22:49:52 <dmiles_afk> but the mistake most provers make is: sometimes such assertions like domain/domainSubclass of a predicate might not propagate to the removed literals.. yet they were once incorporated in the Prenex form of the rule.. but lost in the disjunctive normal form

22:50:09 * dmiles_afk rettypes that line...

22:50:18 <DanC> man... jos's prolog code looks similarly full of punctuation and devoid of comments and whitespace (http://eulersharp.sourceforge.net/2006/02swap/euler.yap)

22:50:38 <DanC> whitespace is not that expensive, folks! I suppose comments cost some effort

22:51:21 <dmiles_afk> but the mistake most provers make is: assertions like domain/domainSubclass of a predicate might not get propagated as enforced into the entailments.. yet they were once incorporated in the Prenex form of the rule.. some of the antecedent type enforcement is lost in the disjunctive normal form

22:52:00 * dmiles_afk thinks a paper about it is better than irc.. for this topic

22:52:12 <jmv> DanC, it's 2 texts concatenated, look at page 80

22:53:34 <DanC> ah. thanks.

22:53:45 <dmiles_afk> the problem one has when writing prolog is that if you don't compress it enough.. you have to hit the scroll on your mouse wheel more often while working on it

22:54:27 <dmiles_afk> so between interdependent functionality you at most want to scroll no more than 5 times

22:55:26 <DanC> I like this bit: "In resolution logic one reduces a reasoning problem T |= φ to cl(T ∧ ¬φ) |= ⊥, where cl stands for a clausification operation. The latter problem is not equivalent to the former, but the two problems are ‘equisolvable’ in the sense that the former is solvable if and only if the latter is refutable by resolution."

22:55:41 <DanC> skolemization is icky

22:56:20 <DanC> wild... I didn't realize it relies on the/an axiom of choice

22:56:54 <dmiles_afk> when i see that i usually assume it's because he doesn't want to look at it any more.. "glad that hell is over".. / the code is perfect.. and doesn't need to be

22:56:57 <DanC> heh... "Regrettably, your automated reasoning assistant is working on a different problem than you and you are not able to help when it gets stuck."

22:58:17 <dmiles_afk> (doesn't want to scroll down an extra 60 lines) but at that point that code should have been moved to the bottom of the file

23:01:56 <DanC> odd... "In order to keep things as simple as possible we restrict attention to one-sorted first-order logic without function symbols."

23:02:34 <DanC> (eigen)variables

23:02:42 <DanC> ^ a gap in my education

23:03:00 <DanC> oh... good... "The completeness proof can be generalized to the case with function symbols."

23:04:41 <DanC> thanks a bunch, jmv, this paper is just my speed... now to test my understanding by coding up Definition 1 in scala..

23:04:51 <dmiles_afk> what they are secretly as like skolems.. they remain variables.. though for mechanics

23:05:20 <dmiles_afk> but they are "dressed up" with meaning

23:05:43 <dmiles_afk> as the system runs i believe those variables get more meaning attached

23:05:58 <dmiles_afk> instead of trying to give them first class values

23:06:03 <jmv> :) reading too


23:07:02 <DanC> by the way... I've got rdflogic (RDF simple entailment) layered on top of exiconj (existential conjunctive logic, aka deductive database), layered atop propcalc in http://bitbucket.org/DanC/swap-scala/src/tip/src/main/scala/

23:07:56 <jmv> I'll have a look DanC

23:07:59 <DanC> and I've got complog (milawa's computational logic...well, part of it) layered on foleq (first order logic with equality) in that same directory

23:08:26 <DanC> I'm afraid my code is light on comments too

23:08:48 <Phurl__> Phurl__ is now known as Phurl

23:09:56 <jmv> well, at least you're on IRC :) ... I recently learned a bit of Haskell, but I'm ignorant of Scala yet

23:10:31 <DanC> scala mixes in the ML worldview with java. it's pretty cool

23:10:38 <DanC> it's even got some lazy constructs

23:10:56 <DanC> .g fun and frustration with scala

23:10:57 <phenny> DanC: http://www.advogato.org/person/connolly/diary/71.html

23:11:58 <dmiles_afk> so there are some things in scala one can do that would be a pain in straight java?

23:12:14 <DanC> yes. everything

23:12:15 <DanC> ;-)

23:12:21 <dmiles_afk> ok good!

23:12:34 <DanC> i.e. java is so painful I've never used it. but I like scala

23:12:36 <darkthing> Heh. I've just noticed a conference I'm off to in a couple of weeks' time generates its schedule web pages from RDF data.

23:12:38 <darkthing> http://dev8d.org/programme.html

23:12:40 <dc_swig> L: http://dev8d.org/programme.html from darkthing

23:12:58 <dmiles_afk> i've been migrating .java code to C# because of some things.. but in the results i got i still had fixnum boxing

23:13:19 <DanC> L: wow! something RDF-related that looks done by a competent designer!

23:13:21 <dc_swig> Added comment L1.

23:13:34 <dmiles_afk> scala is one of the first jvm langs to at least consider how to solve the problem

23:13:44 <DanC> L: though... darn... in-your-face-URLs for the source data

23:13:45 <dc_swig> Added comment L2.

23:14:29 <dmiles_afk> the only sadness i have with a jvm.. is when i want my data maintained in the jvm.. i have to marshall it in/out of swi prolog

23:15:15 <dmiles_afk> i need something in the jvm that can do swi-prolog stuff as efficiently as swi prolog does.. that shouldn't be asking too much

23:15:16 <DanC> there's no good prolog for the jvm?

23:15:22 <DanC> oh

23:15:28 <DanC> actually, that's asking quite a bit.

23:15:33 <DanC> swi-prolog is highly engineered

23:15:45 <dmiles_afk> yeah there are a few that are actually pretty good.. but just in comparison

23:16:01 <DanC> though I guess the warren abstract machine is the bulk of the trick, yes? (my knowledge here is pretty thin)

23:17:03 <DanC> logger, pointer?

23:17:03 <DanC> See http://chatlogs.planetrdf.com/swig/2010-02-12#T23-17-03

23:17:11 <dmiles_afk> if it was more the machine model.. then Java would be able to do it just as good.

23:17:33 <jmv> Jos tried over the years to find a decent Prolog in Java ... I think he gave up

23:17:38 <dmiles_afk> when the java prologs do the machine like JavaBinProlog

23:18:05 <dmiles_afk> indeed the wam based java ones do the best

23:19:07 <jmv> which wam based java exist ?

23:19:10 <DanC> he sounded happy with yap... is that java-based?

23:19:17 <jmv> no

23:19:30 <dmiles_afk> jmv: JinniProlog

23:19:54 <DanC> nope... "Portability: The whole system is now written in C. " -- http://www.dcc.fc.up.pt/~vsc/Yap/

23:20:11 <dmiles_afk> i think this is the only wam based one: http://www.binnetcorp.com/download/jinnidemo/index.html

23:21:09 <dmiles_afk> i decompiled his code.. and he converts everything to int pointers and does wam stuff with them

23:21:09 <kwijibo_> kwijibo_ is now known as kwijibo

23:21:10 <jmv> alas not open source

23:21:24 <dmiles_afk> well it is.. he left it public on his teaching sites for students

23:22:10 <dmiles_afk> i decompiled it before i found it :( but the problem was i needed more SWI compat

23:22:58 <dmiles_afk> i've been working with a guy who is trying to make a very efficient java prolog.. only 5 times slower than swi prolog at this point.. but he's giving up a lot of prolog

23:23:44 <dmiles_afk> when he gets done.. i hope to make it do all the SWI-prolog stuff w/o losing the efficiency gains he made

23:23:49 <jmv> maybe it would be good to write a WAM in Scala, with a prolog compiler in Prolog

23:24:52 <dmiles_afk> one good prolog -in prolog "compiler" can be gleaned from JProlog

23:25:29 <dmiles_afk> well it emits .java code.. but it is based on exmapling what was once a wam engine

23:25:45 <dmiles_afk> just written out inlined style in the java ops

23:26:19 <dmiles_afk> so like putvar x,y .. is PrologLib.putVar(x,y)

23:27:11 <dmiles_afk> i guess it also then qualifies as wam and i should have included it above

23:28:04 <dmiles_afk> but it's a possible idea with a scala/prolog pair

23:29:49 <jmv> :)

23:30:21 <dmiles_afk> "Prolog Cafe is a Prolog-to-Java source-to-source translator system. Prolog programs are first translated into Java programs via the WAM (Warren Abstract Machine), and then those programs are compiled by a usual Java compiler such as SUN's JDK SE"

23:30:47 <dmiles_afk> thats the one

23:31:27 <dmiles_afk> i ported that codebase to emit its prolog into the LarKC object system

23:31:35 <DanC> there you go... Defn 1 done: http://bitbucket.org/DanC/swap-scala/src/tip/src/main/scala/coherent.scala

23:31:50 <dmiles_afk> well at the time it was just called researchcyc :)

23:32:21 <jmv> DanC , how quick !!

23:32:26 <dmiles_afk> the results i got were nice and useful.. the problem was still being 10-20 times slower than SWI-prolog

23:34:19 <jmv> and SWI is reputed to be slower than Yap!

23:34:28 <dmiles_afk> yeah :(

23:34:37 <jmv> I'm installing scala eclipse plugin

23:34:53 <dmiles_afk> so i went back to marshalling

23:35:21 <dmiles_afk> marshalling over the JNI interface again

23:35:53 <dmiles_afk> (which is lame becasue every triple and constant is held in ram both places)

23:36:45 <dmiles_afk> the next possibility is to make SWI-prolog (or Yap-prolog) the authoritative data holder, and use SWIG to let java touch them

23:37:20 <dmiles_afk> thats still marshalling but per object data access

23:37:39 <dmiles_afk> so it saves RAM only

23:38:09 <jmv> JPL is not convenient to use , it never finds the .so

23:38:33 <dmiles_afk> its a pain .. you pretty much have to have SWI-prolog in your lib/ dir of the app

23:38:44 <dmiles_afk> well kinda sorta

23:38:57 <dmiles_afk> it can actually be made to work ;)

23:39:55 <dmiles_afk> one thing funky.. is LarKC to load my KB is 12gb... in SWI-prolog its 4gb

23:40:13 <dmiles_afk> 16gb to use my current marshalling

23:41:01 <dmiles_afk> (my machine only has 8gb physical)

23:41:17 <jmv> funny yes, SWI is mostly Jan Wielemaker, while LarKC is a lot of money from the European Community!!!

23:42:58 <dmiles_afk> i ported larkc to C# and got it running. i was hoping to somehow trim down memory consumption

23:43:31 <dmiles_afk> but now trying to find another target.. i was thinking of scala.. still not sure

23:43:54 <dmiles_afk> technically the .java port should still sufficiently be a best case

23:44:42 <dmiles_afk> what scala might bring.. is a SubLCons that can be headed by a fixnum at a smaller consumption rate

23:45:00 <dmiles_afk> instead of keeping a boxed value in the CAR

23:45:25 <dmiles_afk> but if someone setCar(myCons,"String") its all over .. the underlying holder is insufficient

23:46:36 <dmiles_afk> but the cons is the underlying problem for the memory consumption

23:47:06 <dmiles_afk> the next problem then still is when java is trying to do basic prolog style unification

23:47:27 <dmiles_afk> it's just slower than a hand written C prolog

23:47:50 <dmiles_afk> larkc works around it by working smarter .. not so hard, to solve the same types of problems

23:48:29 <dmiles_afk> it uses a ton of heuristics and never is forced into prolog style theorem proving

23:49:04 <dmiles_afk> but many times a prolog-style is a good idea.. which is why i've been using JPL

23:49:42 <dmiles_afk> but if i had my perfect world i'd use a fully jvm solution

23:50:22 <dmiles_afk> and also still have to fix the (a b c d) not using 4 linked lists

23:50:52 <dmiles_afk> to use a prolog style 'a/3'+[b,c,d] as an array

23:51:26 <jmv> or maybe do some stuff in C and for the rest do native Java compilation with gcj and link all

23:52:14 <dmiles_afk> oh and the gcj port .. is actually 1.2 the speed of the sun jitted version.. this is only true the last year or two

23:52:41 <dmiles_afk> erm say 20% slower

23:53:03 <dmiles_afk> which i always thought had been 2 times faster

23:53:22 <dmiles_afk> but for code design reasons the gcj is still a smart idea

23:53:46 <darkthing> You mean gcj is slower than Sun JIT?

23:54:03 <dmiles_afk> yup.. not by much. but sun jit will always win

23:54:33 <darkthing> Ah, your first sentence on the subject sounded like it produced faster code.

23:54:50 <dmiles_afk> same with Excelsior JET (commercial ahead-of-time compiler)

23:55:16 <dmiles_afk> it was the same as GCJ.. just a tad slower than Just-in-time

23:55:44 <dmiles_afk> Just-in-time is better than Ahead-of-time .. which is often counter intuitive

23:56:14 <dmiles_afk> oh but i should qualify only recently has Sun made their JIT faster

23:56:15 <darkthing> Interesting. Why is that?

23:57:26 <dmiles_afk> well one answer i've heard is usually AOT can not see everything that can happen enough to optimize.. and when an AOT compiler is used it's used on a system that can't have further JIT love applied

23:58:01 <darkthing> Sounds like the AOT optimiser is only doing half its job.

23:58:39 <dmiles_afk> generally.. and AOT in mono for instance VETOs further JITing

23:58:59 <dmiles_afk> but back to GCJ.. they just need some JITing as well

23:59:33 <dmiles_afk> Excelsior JET actually does a ton more AOT (whole program) than GCJ does

23:59:53 <jmv> sorry AOT==?

23:59:58 <dmiles_afk> ahead-of-time


The IRC chat here was automatically logged without editing and contains content written by the chat participants identified by their IRC nick. No other identity is recorded.


Provided by Dave Beckett as part of Planet RDF