datanucleus / datanucleus-mongodb
DataNucleus support for persistence to MongoDB datastores
Add a property datanucleus.mongodb.replicaSetName
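For illustration, the requested property might be used alongside the existing connection properties like this (a hypothetical sketch; the property name comes from this request, and the URL/server names are placeholders):

```
datanucleus.ConnectionURL=mongodb:host1:27017,host2:27017/mydb
datanucleus.mongodb.replicaSetName=rs0
```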
Refer to datanucleus/datanucleus-core#134
Since MongoDB stores documents in JSON/BSON form, we can in principle support nested embedded collections (and maps).
Some methods are now deprecated.
We could potentially allow
pm.newQuery("MongoDB", "db.MYCOLL.findOne({'_id':'" + oid + "'})");
em.createNativeQuery("db.MYCOLL.findOne({'_id':'" + oid + "'})", MyClass.class);
Obviously this would only support a subset of the MongoDB API calls available from DB.getCollection(...), namely find and findOne. Similarly, the returned DBObjects would need unmapping to produce the Object[] for a row of results.
The alternative is already available via NucleusConnection
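That existing alternative looks roughly like the following (a sketch assuming the JDO API and the MongoDB Java driver 2.x/3.x "DB" class; "MYCOLL" and the id value are placeholders):

```java
import javax.jdo.PersistenceManager;
import javax.jdo.datastore.JDOConnection;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;

// Obtain the native MongoDB connection underlying the PersistenceManager
JDOConnection jdoConn = pm.getDataStoreConnection();
try {
    DB db = (DB) jdoConn.getNativeConnection();
    DBObject doc = db.getCollection("MYCOLL").findOne(new BasicDBObject("_id", oid));
} finally {
    jdoConn.close(); // must hand the connection back to DataNucleus
}
```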
I'm currently migrating from DN core 5.0.8 to 5.1.8 and datanucleus-mongodb
5.0.4 to 5.1.0-release. I get the following exception when trying to read an object containing a reference to a sub-object:
Caused by: java.lang.NullPointerException
    at org.datanucleus.store.mongodb.MongoDBUtils.getClassNameForIdentity(MongoDBUtils.java:216)
    at org.datanucleus.store.mongodb.MongoDBStoreManager.getClassNameForObjectID(MongoDBStoreManager.java:190)
    at org.datanucleus.ExecutionContextImpl.getClassNameForObjectId(ExecutionContextImpl.java:3741)
    at org.datanucleus.ExecutionContextImpl.getClassDetailsForId(ExecutionContextImpl.java:3635)
    at org.datanucleus.ExecutionContextImpl.findObject(ExecutionContextImpl.java:3502)
    at org.datanucleus.identity.IdentityUtils.getObjectFromIdString(IdentityUtils.java:474)
    at org.datanucleus.store.mongodb.fieldmanager.FetchFieldManager.getValueForContainerRelationField(FetchFieldManager.java:968)
    at org.datanucleus.store.mongodb.fieldmanager.FetchFieldManager.fetchNonEmbeddedObjectField(FetchFieldManager.java:735)
    at org.datanucleus.store.mongodb.fieldmanager.FetchFieldManager.fetchObjectField(FetchFieldManager.java:683)
    at org.datanucleus.state.StateManagerImpl.replacingObjectField(StateManagerImpl.java:1953)
    at de.mpg.ipp.codac.business.schema.teamwork.UserGroup.dnReplaceField(UserGroup.java)
    at de.mpg.ipp.codac.business.schema.common.W7XDbNamedObject.dnReplaceFields(W7XDbNamedObject.java)
    at org.datanucleus.state.StateManagerImpl.replaceFields(StateManagerImpl.java:4302)
    at org.datanucleus.state.StateManagerImpl.replaceFields(StateManagerImpl.java:4327)
    at org.datanucleus.store.mongodb.MongoDBUtils$1.fetchFields(MongoDBUtils.java:796)
    at org.datanucleus.state.StateManagerImpl.loadFieldValues(StateManagerImpl.java:3719)
    at org.datanucleus.state.StateManagerImpl.initialiseForHollow(StateManagerImpl.java:383)
    at org.datanucleus.state.ObjectProviderFactoryImpl.newForHollow(ObjectProviderFactoryImpl.java:113)
    at org.datanucleus.ExecutionContextImpl.findObject(ExecutionContextImpl.java:3194)
    at org.datanucleus.store.mongodb.MongoDBUtils.getObjectUsingApplicationIdForDBObject(MongoDBUtils.java:791)
    at org.datanucleus.store.mongodb.MongoDBUtils.getPojoForDBObjectForCandidate(MongoDBUtils.java:753)
    at org.datanucleus.store.mongodb.query.LazyLoadQueryResult.getNextObject(LazyLoadQueryResult.java:324)
    at org.datanucleus.store.mongodb.query.LazyLoadQueryResult$QueryResultIterator.next(LazyLoadQueryResult.java:449)
    at de.mpg.ipp.codac.persistence.DataNucleusProvider.doGetObjects(DataNucleusProvider.java:362)
    ... 30 more
I'm testing against MongoDB 3.0.15. I'll provide a test case demonstrating the issue; two test documents for the test case are attached.
DN.zip
I'm not sure if this was a conscious decision for efficiency, or if this is a bug, but I found it surprising.
I have the following entity classes (only showing the relevant portions):
@Entity
public class TestEntity implements Serializable {
@OneToMany(mappedBy = "testEntity")
private List<RelatedListElementEntity> relatedListElements;
}
@Entity
public class RelatedListElementEntity implements Serializable {
@ManyToOne
private TestEntity testEntity;
}
When I persist RelatedListElementEntity I expect to see something like:
{
"_id" : "8a998109363148060136314806f30002",
"testEntity" : "8a998109363148060136314806f30001"
},
{
"_id" : "8a99810936314806013631484c590003",
"testEntity" : "8a998109363148060136314806f30001"
}
When I persist TestEntity I expect to see something like:
{
"_id" : "8a998109363148060136314806f30001",
}
The RelatedListElementEntity looked good, but for TestEntity I see:
{
"_id" : "8a998109363148060136314806f30001",
"relatedListElements" : [ "8a998109363148060136314806f30002", "8a99810936314806013631484c590003" ],
}
(the references to the list elements have been persisted into the elements' own documents AND into the "owning" TestEntity document).
Is this by design or a bug? If a bug, I could look into correcting it. It seems StoreFieldManager is the right place to look, and we may want to add some checks to storeObjectField() to prevent this being stored. One thought is to add more checks to isStorable() (which storeObjectField() calls).
What do you think?
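One possible shape for that extra check, as a hypothetical sketch only: the method name isStorable and the AbstractMemberMetaData.getMappedBy() accessor are taken from DataNucleus core, but this has not been verified against the actual StoreFieldManager code, and skipping every mappedBy field may be too aggressive for MongoDB's storage model:

```java
// Hypothetical extra check: skip the non-owning ("mappedBy") side of a
// bidirectional relation so element ids are not duplicated into the
// owner document as well.
protected boolean isStorable(AbstractMemberMetaData mmd)
{
    if (mmd.getMappedBy() != null)
    {
        // Inverse side of a bidirectional relation; the relation is
        // already stored on the owning side.
        return false;
    }
    return isStorable(mmd.getPersistenceModifier()); // existing checks
}
```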
Refer to datanucleus/datanucleus-core#135
Would provide an alternative to just having a string id form, and would potentially allow queries involving the related object.
When trying to sort a result using a field that is not part of the entity model (perhaps due to a typo), compilation of the sort section of the query eventually reaches the processPrimaryExpression method of the QueryToMongoDBMapper class, which calls getFieldNameForPrimary to get the actual name of the field in the Mongo documents. But since the field isn't registered in the entity contract, the metadata doesn't know about it and returns null.
This causes a NucleusException in any case, with an exception message like "Primary Expressions are not supported by this mapper" (since that is all the superclass does), even though QueryToMongoDBMapper does support primary expressions. Furthermore, there is a log message warning the user that the field isn't part of the entity and that the sorting therefore won't take place in the datastore; this suggests that query compilation should be successful, with the results simply left unordered.
So I'm unsure whether this is the desired behaviour, whether an exception with a clearer message should be thrown, or whether it should just work. In any case, this is very confusing; I really suggest giving this point a little attention.
Thanks
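For illustration, a query of this shape triggers the behaviour described above (the entity and field names are hypothetical; "pric" stands in for a typo of an existing "price" field):

```
SELECT p FROM Product p ORDER BY p.pric
```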
We need to support the case where the PK is formed from a persistable object. Currently it tries to serialise the related object, which is wrong.
In org.datanucleus.store.mongodb.MongoDBUtils.getClassNameForIdentity(Object, AbstractClassMetaData, ExecutionContext, ClassLoaderResolver) I get a NullPointerException when it tries to retrieve the table for an abstract superclass.
Table rootTable = storeMgr.getStoreDataForClass(rootCmd.getFullClassName()).getTable();
getStoreDataForClass returns null for an abstract superclass. I investigated the table of the StoreDataManager and it contains entries only for concrete classes; am I doing something wrong in my annotations?
@PersistenceCapable(detachable = "true", identityType = IdentityType.APPLICATION, objectIdClass = Id.class)
@Discriminator(strategy = DiscriminatorStrategy.CLASS_NAME)
@Inheritance(strategy = InheritanceStrategy.COMPLETE_TABLE)
public abstract class SuperClass {
..
Does this plugin support MongoDB over SSL? An example would help.
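With the native driver, SSL is typically enabled via a connection-string option (ssl=true), so an answer might look like the following persistence properties. This is a hypothetical sketch: the datanucleus.mongodb.sslEnabled property name is invented here, and whether DataNucleus passes such an option through is exactly what this question asks.

```
datanucleus.ConnectionURL=mongodb:dbhost:27017/mydb
# hypothetical property; would depend on exposing more MongoClientOptions
datanucleus.mongodb.sslEnabled=true
```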
We currently only allow 2 of the MongoClientOptions, so allow many others.
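For reference, the driver-side options that persistence properties could map onto are set via MongoClientOptions.builder() in driver 3.x. A sketch of candidates, not current plugin API (the builder methods shown do exist in the 3.x driver):

```java
import com.mongodb.MongoClientOptions;

MongoClientOptions opts = MongoClientOptions.builder()
        .connectionsPerHost(50)   // already configurable via the plugin
        .connectTimeout(10000)    // candidates to expose as properties:
        .socketTimeout(60000)
        .sslEnabled(true)
        .build();
```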
With these classes:
@PersistenceCapable
public class Hub implements Serializable {
private String name;
private List<Schedule> schedules;
}
And
@PersistenceCapable
@EmbeddedOnly
public class Schedule implements Serializable {
private String day;
@Convert(LocalTimeConverter.class)
private String opens;
@Convert(LocalTimeConverter.class)
private String closes;
}
The converter does not work: the stored value should be a number/Long rather than the String field value, since a converter is specified. The converter code can be found here: https://github.com/cyberquarks/test-jdo-converter/blob/master/src/main/java/mydomain/model/converter/LocalTimeConverter.java
Here's the test case: https://github.com/cyberquarks/test-jdo-converter
The issue affects the latest version of DataNucleus / DataNucleus JDO.
As per the documentation provided by DataNucleus for the MongoDB datastore, I couldn't find any property to set up an authentication database, and the MongoDB Standard Connection String Format mongodb://[username:password@]host1[:port1][,host2[:port2],...[,hostN[:portN]]][/[database][?options]] isn't fully supported: DataNucleus appears to ignore the [?options] part.
Ultimately, mongodb://username:password@host:port/database?authSource=admin
works fine natively to connect to a database with authentication, using the extra option authSource=authdatabase, but it doesn't work with DataNucleus.
When we retrieve a List from MongoDB we retrieve it in exactly the same order as it was persisted; that is, it is stored as a Collection in the owner object. If a user puts an @OrderBy annotation on the field, they may want the list reordered.
This applies to ALL non-RDBMS datastores, so it would make sense to create some generic code that reorders the elements in-memory, perhaps using a Comparator.
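The in-memory reordering step could be sketched as below. This is illustrative only: the method name and the reduction of an @OrderBy spec to a Comparator are assumptions, not the plugin's API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class OrderBySort {
    // Generic in-memory reordering of a retrieved List, as generic
    // non-RDBMS code might do it; the @OrderBy spec is assumed to have
    // been reduced to a Comparator beforehand.
    static <T> List<T> applyOrdering(List<T> retrieved, Comparator<T> orderBy) {
        List<T> ordered = new ArrayList<>(retrieved);
        ordered.sort(orderBy);
        return ordered;
    }

    public static void main(String[] args) {
        // Elements come back in persisted order...
        List<String> persistedOrder = Arrays.asList("pear", "apple", "plum");
        // ...but the user asked for e.g. @OrderBy("name ASC")
        List<String> ordered = applyOrdering(persistedOrder, Comparator.naturalOrder());
        System.out.println(ordered); // [apple, pear, plum]
    }
}
```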
The MongoDB java driver has mapReduce() methods taking in the reduction definition. We need to take in the reduction definition, perhaps as query extension/hint(s).
The user perhaps has their own conversion function for data, so this would be used as a bulk-update type of thing.
On a general note, GAE v1 stored data in a particular mode (with FK-type info in the elements). In v2 it is stored in a new mode (with child information in a property of the parent). Moving from v1 to v2 would be a migration process expressed as a particular map function; again, like a bulk update.
Obviously people more familiar than me with map-reduce will likely have their own uses; things that can't be expressed using JDOQL/JPQL, maybe.
Currently, if there is a JPQL FROM clause (i.e. in all JPQL queries) it logs a warning about ignoring the FROM!
If we have
field IN ('val1')
field IN ('val1', 'val2')
field IN (coll) [where coll is a collection of basic MongoDB compatible types]
then we should be able to evaluate this in-datastore.
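Each of these forms would map naturally onto the datastore's $in operator, e.g. (field and values hypothetical):

```
{ "field" : { "$in" : ["val1", "val2"] } }
```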
See associated pull request
MongoDB v3.4+ detects this attempt and throws exceptions when trying to do this, so let's prevent it at source.
MongoDBPersistenceHandler.locateObjects needs implementing efficiently: firstly determine the DBCollection objects, then for each one do a bulk find.
With the current LazyLoadQueryResult class, when the user calls size() on the results it will load all results (since that is currently the only supported way of getting the size). An alternative would be to do as we do for RDBMS: add a method that executes an additional query returning the count. This would avoid loading all results of the query.
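The proposed alternative could be sketched as follows. The class and member names here are illustrative, not the real LazyLoadQueryResult API; the IntSupplier stands in for whatever executes the count query against the datastore.

```java
import java.util.function.IntSupplier;

// Sketch: size() issues a separate count query instead of materialising
// every result of the original query.
class CountingQueryResult {
    private final IntSupplier countQuery; // e.g. runs a datastore count
    private Integer cachedSize;

    CountingQueryResult(IntSupplier countQuery) {
        this.countQuery = countQuery;
    }

    int size() {
        if (cachedSize == null) {
            // No results are loaded here; only the count is fetched.
            cachedSize = countQuery.getAsInt();
        }
        return cachedSize;
    }
}
```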
During the lifecycle of a model class, a user could do the following things to it.
We also have class rename as a possible scenario.
Refer to core issue 128 regarding database creation/deletion
Refer to core issue 19
When we execute a query and use FetchFieldManager, we have to use the constructor taking an ExecutionContext (i.e. the ObjectProvider is not yet known). This means that we cannot easily wrap any SCO fields in FetchFieldManager. Consequently we need to call
ObjectProvider.replaceAllLoadedSCOFieldsWithWrappers()
just after the FetchFieldManager process.
See PersistenceNucleusContext.
"DB" is deprecated in MongoDB java driver 3.1+, and people should use MongoDatabase instead. This will involve a lot of method updates.
Refer to https://mongodb.github.io/mongo-java-driver/4.3/driver/getting-started/installation/ which states
"we recommend that new applications depend on the mongodb-driver-sync module."
so that means adding this to the POM
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongodb-driver-sync</artifactId>
    <version>4.3.1</version>
</dependency>
and the primary classes, such as MongoClient and MongoDatabase, are in the com.mongodb.client package.
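The shape of the migration, as a sketch (host and database names are placeholders; the 4.x entry points shown are the documented com.mongodb.client API):

```java
// Old (deprecated) style, driver 2.x/3.x:
//   MongoClient client = new MongoClient("localhost");
//   DB db = client.getDB("mydb");

// New style with mongodb-driver-sync 4.x:
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;

MongoClient client = MongoClients.create("mongodb://localhost:27017");
MongoDatabase db = client.getDatabase("mydb");
```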
When we make use of a persistable class in the MongoDB plugin we currently go direct to "manageClasses", whereas we should simply check for StoreData, and only then go to the Mongo "DB" object to check it.
Currently, JPA 1-1 unidirectional relations default to UNIQUE, which will break MongoDB since the relation is stored in a String field, and two rows holding "null" would be treated as conflicting.
datanucleus-core 3.0 M3 introduces the ability to receive all objects to be deleted in a single call to StorePersistenceHandler.deleteObjects(). This allows us to remove the objects from MongoDB in fewer calls. However, the MongoDB Java driver doesn't currently provide a bulk delete method, so we can't do this just yet.