
h2rdf's People

Contributors

npapa

h2rdf's Issues

h2rdf 0.4

What steps will reproduce the problem?

hadoop jar H2RDF_server.jar sampler.SamplerEx input_Path HBaseTable

What is the expected output? What do you see instead?

The Hadoop job completes successfully, but when importing the HBase table, the following error is shown:

--------------------

14/04/01 23:35:32 WARN ipc.HBaseClient: Unexpected exception receiving call responses
java.lang.OutOfMemoryError: Java heap space
        at java.lang.reflect.Array.newArray(Native Method)
        at java.lang.reflect.Array.newInstance(Array.java:52)
        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:542)
        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:289)
        at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
        at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)

-----------
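A note on the error, not from the original thread: the OutOfMemoryError is raised while the HBase client deserializes an RPC response, which usually means a single response (for example, one scanner fetch) is larger than the client heap can hold. Besides raising the client JVM heap, a common mitigation is to bound how much data each RPC returns. The sketch below shows the scanner-tuning approach using the HBase 0.92 client API; the table name and tuning values are illustrative, not taken from H2RDF.

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;

public class BoundedScan {
    public static void main(String[] args) throws Exception {
        HTable table = new HTable(HBaseConfiguration.create(), "HBaseTable");
        Scan scan = new Scan();
        scan.setCaching(100); // rows fetched per RPC; large values inflate each response
        scan.setBatch(100);   // max columns per Result, guards against very wide rows
        ResultScanner scanner = table.getScanner(scan);
        try {
            for (Result r : scanner) {
                // process one row at a time instead of buffering everything
            }
        } finally {
            scanner.close();
            table.close();
        }
    }
}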



What version of the product are you using? On what operating system?
1. Installed Hadoop 1.0.1, HBase 0.92.0, and ZooKeeper 3.3.2.
2. Installed H2RDF 0.3.

Please provide any additional information below.


Original issue reported on code.google.com by [email protected] on 1 Apr 2014 at 3:59

H2RDF Jena connection

Hi,
Could you please give us more details about manipulating RDF data stored in H2RDF through the Jena Java framework (loading data through Jena, inference)?
Best Regards,
Hiba
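No answer is recorded in this thread, but the third issue below shows the loading pattern in use: gr.ntua.h2rdf.examples.HbaseSequentialImportExample builds Jena Node and Triple objects and passes them to a store handle via add(Triple). The following is a minimal sketch of that pattern; the TripleStore interface is a placeholder for whatever store class the example actually instantiates (not shown in this thread), and the example.org URIs are hypothetical.

import com.hp.hpl.jena.graph.Node;
import com.hp.hpl.jena.graph.Triple;

public class LoadThroughJena {
    // Placeholder for the H2RDF store handle created in
    // HbaseSequentialImportExample; the real class name is not shown here.
    interface TripleStore {
        void add(Triple t);
    }

    // Adds one triple using the same Node/Triple pattern as the issue below.
    // The angle brackets inside createURI follow the convention used by
    // HbaseSequentialImportExample.
    static void loadOne(TripleStore store) {
        Node s = Node.createURI("<http://example.org/GraduateStudent1>");
        Node p = Node.createURI("<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>");
        Node o = Node.createURI("<http://example.org/Term>");
        store.add(Triple.create(s, p, o));
    }
}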

Run time error disabling the map reduce

What steps will reproduce the problem?
1. The current Hadoop and HBase configuration is pseudo-distributed mode.

2. The code we ran is from partialJoin.Ex2.java, which contained commented-out
lines demonstrating the MapReduce operation; we uncommented them for execution.

3. We also commented out the mapred.fairscheduler.pool configuration lines to
disable the pool name, since we use the default pool.

4. When the join algorithm is set to 'map reduce' (value 1), we get: FATAL
org.apache.hadoop.mapred.Child: Error running child:
java.lang.NoClassDefFoundError: javaewah/EWAHCompressedBitmap

5. The data is generated using gr.ntua.h2rdf.examples.HbaseSequentialImportExample
in the H2RDF client. The following code is used to generate it:

for (int i = 1; i < 10; i++) {
    Node s = Node.createURI("<" + d0_0 + "GraduateStudent" + i + ">");
    Node p = Node.createURI("<" + rdf + "type" + ">");
    Node o = Node.createURI("<" + arco + "term" + ">");
    Triple triple = Triple.create(s, p, o);
    store.add(triple);

    s = Node.createURI("<" + d0_0 + "GraduateStudent" + i + ">");
    p = Node.createURI("<" + arco + "kyotoDomainRelevance" + ">");
    o = Node.createURI("\"80\"^^<http://www.w3.org/2001/XMLSchema#double>");
    triple = Triple.create(s, p, o);
    store.add(triple);

    s = Node.createURI("<" + d0_0 + "GraduateStudent" + i + ">");
    p = Node.createURI("<http://www.w3.org/2000/01/rdf-schema#label>");
    o = Node.createURI("<" + d0_0 + "Label" + i + ">");
    triple = Triple.create(s, p, o);
    store.add(triple);
}

What is the expected output? What do you see instead?
 The query I am using is in partialJoin.Ex2.java: 
String prolog = "PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> " +
    "PREFIX arco: <http://www.gate.ac.uk/ns/ontologies/arcomem-data-model.rdf#> " +
    "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>";

String NL = System.getProperty("line.separator");
String q = prolog + NL +
    "SELECT ?termInstance ?score ?termLabel " +
    "WHERE { ?termInstance rdf:type arco:Term . " +
    "?termInstance arco:kyotoDomainRelevance ?score . " +
    "?termInstance rdfs:label ?termLabel . " +
    "}";

As you can see, I expect the query to return the entire table contents, but the
MapReduce job itself is not running.

What version of the product are you using? On what operating system?

The OS is RHEL 6.0; HBase is 0.92.0, Hadoop is 0.20.205.0, and ZooKeeper is 3.4.3.
We tried JavaEWAH 0.5.2 instead of 0.3.2 to resolve the runtime error, and we also
copied the javaewah jar into the Hadoop and HBase lib folders in case the job was
searching for it there and failing to find it (see the classpath sketch below).
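A hedged note, not a confirmed fix: a NoClassDefFoundError raised inside org.apache.hadoop.mapred.Child means the jar is missing from the task JVM's classpath on the worker side, so copying it into local Hadoop or HBase lib folders on the client does not necessarily help. On Hadoop 0.20.x the usual remedies are the -libjars generic option (if the driver uses ToolRunner) or DistributedCache, as sketched below; the HDFS path is hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;

public class ShipEwahJar {
    public static void main(String[] args) throws Exception {
        // The jar must already be in HDFS, e.g. (hypothetical path):
        //   hadoop fs -put javaewah-0.3.2.jar /lib/javaewah-0.3.2.jar
        Configuration conf = new Configuration();
        // Adds the jar to the classpath of every map and reduce task.
        DistributedCache.addFileToClassPath(new Path("/lib/javaewah-0.3.2.jar"), conf);
        // ... build and submit the H2RDF join job with this Configuration ...
    }
}

Separately, note that the generator above writes arco:term (lowercase) while the query matches arco:Term, so even once the classpath problem is fixed the join may legitimately return no rows.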

Please provide any additional information below.
I have attached the stack trace as well as the Ex2.java program we are using.

Kindly let us know exactly what we need to do to execute the MapReduce job.

Original issue reported on code.google.com by [email protected] on 5 Nov 2012 at 3:51

