
reactivemongo's Introduction

ReactiveMongo

ReactiveMongo is a Scala driver that provides fully non-blocking and asynchronous I/O operations.

Usage

First add the dependencies in your build.sbt.

libraryDependencies ++= Seq(
  "org.reactivemongo" %% "reactivemongo" % "VERSION"
)


See the documentation

Build manually

To benefit from the latest improvements and fixes, you may want to compile ReactiveMongo from source. You will need a Git client and SBT.

From the shell, first checkout the source:

$ git clone git@github.com:ReactiveMongo/ReactiveMongo.git

Then go to the ReactiveMongo directory and launch the SBT build console:

$ cd ReactiveMongo
$ sbt
> +publishLocal

Running tests:

In order to execute the unit and integration tests, SBT can be used as follows.

sbt testOnly

The test environment must be able to handle the maximum number of incoming connections for the MongoDB instance. Check this, and raise it if necessary, using ulimit -n.


Reproduce CI build:

To reproduce a CI build, see the Docker tools.

Learn More

See also the samples

reactivemongo's People

Contributors

adrien-aubel, aloiscochard, analytically, atry, aurelienrichez, avdv, baloo, bfaissal, borice, cchantep, chemikadze, edofic, gatorcse, imikushin, jbgi, jtjeferreira, konradwudkowski, lucasrpb, markvandertol, nevang, neverov, oleastre, ornicar, pchlupacek, scala-steward, sgodbillon, shirishp, skirino, sullis, viktortnk


reactivemongo's Issues

Make it possible to create geospatial indexes

The API of ReactiveMongo allows creating an index with ascending and descending keys, but a geospatial index isn't possible.

The key constructor parameter of reactivemongo.api.indexes.Index takes a List[(String, Boolean)], where the Boolean element defines whether the index key should be ascending or descending. Having a third option there, defining that the key is geospatial, would be convenient. The Boolean type would then have to be changed to an Enumeration type, and the toBSONDocument method in the IndexesManager object would have to be adapted to the new key type.
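The enumeration-based key type proposed above could look like the following standalone sketch. All names here are illustrative, not the driver's actual API:

```scala
// Hypothetical replacement for the Boolean in List[(String, Boolean)]:
// a small enumeration so a key can also be marked as geospatial.
sealed trait IndexType { def value: Any }
case object Ascending  extends IndexType { val value = 1 }    // serialised as 1
case object Descending extends IndexType { val value = -1 }   // serialised as -1
case object Geo2D      extends IndexType { val value = "2d" } // serialised as "2d"

// The index key then becomes List[(String, IndexType)]; the BSON
// serialisation would write each entry's value instead of only 1 / -1.
val key: List[(String, IndexType)] = List("location" -> Geo2D, "name" -> Ascending)
```
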

SBT

Looking at the README.md now - how do I install SBT?

Unix socket support

Hey

I was wondering if you are planning to add support for unix sockets or if you have an idea where I could implement them myself. I guess the right place would be the core.NodeSet.Node case class as it handles the connection.

Btw. Thanks for your work on this project :)

Best Regards
Thomas

Loading files from GridFS is very CPU intense

I am consuming the enumeratee returned from gridfs.find(...) with the following iteratee:

Iteratee.consumeArray[Byte]

This consuming is burning my CPU... Is there any way to retrieve the file as a whole? If not, there really needs to be one.

[Feature] BSON conversion macros

Play Framework 2.1 introduced JSON Macro Inception: macros for automatically creating Reads and Writes instances for a given case class.

Since ReadBSON and WriteBSON follow the same principles, it should be possible to automate this, allowing for code such as the following (format is read and write combined):

case class Foo(bar: String, baz: Int)
implicit val fooBSONFormat = BSON.format[Foo]

I'll give implementing this a go, but I can't promise anything.

play json macro source here

Something wrong with authentication in dev mode?

I see this a lot in dev mode, but never in prod mode. I can't save to a database that I clearly have permission to write to (i.e. it works only about 10% of the time).

Dev mode = windows 7 machine
Prod mode = ubuntu

Database: Mongo 2.4.1

[error] application - Unable to log user in. An exception was thrown
reactivemongo.core.commands.LastError: MongoError['not authorized for insert on myMongoDb.Authentications' (code = 16544)]
at reactivemongo.core.commands.LastError$.apply(commands.scala:233) ~[reactivemongo_2.10-0.8.jar:0.8]
at reactivemongo.core.commands.LastError$.apply(commands.scala:231) ~[reactivemongo_2.10-0.8.jar:0.8]
at reactivemongo.core.commands.BSONCommandResultMaker$class.apply(commands.scala:61) ~[reactivemongo_2.10-0.8.jar:0.8]
at reactivemongo.core.commands.LastError$.apply(commands.scala:231) ~[reactivemongo_2.10-0.8.jar:0.8]
at reactivemongo.core.commands.LastError$.meaningful(commands.scala:242) ~[reactivemongo_2.10-0.8.jar:0.8]
at reactivemongo.api.FailoverBasicCollection$$anonfun$insert$1.apply(collection.scala:303) ~[reactivemongo_2.10-0.8.jar:0.8]

Logging configuration

Hi! Is it possible to set the log level to INFO? I get this output every second:

app: 18:57:10.052 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MonitorActor - set: a primary is available
app: 18:57:10.053 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MongoDBSystem - single node, update...NodeSet(None,None,Vector(Node(localhost:27017,Vector(MongoChannel([id: 0x62c4afc4, /127.0.0.1:55438 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x4805e9f1, /127.0.0.1:55439 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x7c41f227, /127.0.0.1:55440 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x57e40274, /127.0.0.1:55441 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x3a5d3ac0, /127.0.0.1:55442 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x3ebc312f, /127.0.0.1:55443 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x354124d6, /127.0.0.1:55444 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x47d978ea, /127.0.0.1:55445 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x262f4813, /127.0.0.1:55446 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x64f01d52, /127.0.0.1:55447 => localhost/127.0.0.1:27017],Ready,Set())),PRIMARY,None)))
app: 18:57:10.053 [mongodb-akka.actor.default-dispatcher-3] DEBUG r.core.actors.MonitorActor - set: a primary is available
app: 18:57:10.054 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MongoDBSystem - NodeSet is now NodeSet(None,None,Vector(Node(localhost:27017,Vector(MongoChannel([id: 0x62c4afc4, /127.0.0.1:55438 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x4805e9f1, /127.0.0.1:55439 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x7c41f227, /127.0.0.1:55440 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x57e40274, /127.0.0.1:55441 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x3a5d3ac0, /127.0.0.1:55442 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x3ebc312f, /127.0.0.1:55443 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x354124d6, /127.0.0.1:55444 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x47d978ea, /127.0.0.1:55445 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x262f4813, /127.0.0.1:55446 => localhost/127.0.0.1:27017],Ready,Set()), MongoChannel([id: 0x64f01d52, /127.0.0.1:55447 => localhost/127.0.0.1:27017],Ready,Set())),PRIMARY,None)))
app: 18:57:10.054 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MongoDBSystem - AUTH: nothing to do. authenticationHistory is AuthHistory(List())
app: 18:57:10.054 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MonitorActor - set: a primary is available
app: 18:57:12.052 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MongoDBSystem - ConnectAll Job running...
app: 18:57:12.052 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MonitorActor - set: a primary is available
app: 18:57:14.052 [mongodb-akka.actor.default-dispatcher-3] DEBUG r.core.actors.MongoDBSystem - ConnectAll Job running...
app: 18:57:14.052 [mongodb-akka.actor.default-dispatcher-5] DEBUG r.core.actors.MonitorActor - set: a primary is available

I've tried to configure the log level in application.conf but it doesn't help :(

akka {
  event-handlers = ["akka.event.slf4j.Slf4jEventHandler"]
  loglevel = "INFO"
  stdout-loglevel = "INFO"
  log-config-on-start = on
}
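Assuming the driver logs through SLF4J with a Logback backend (which the timestamped output above suggests), the driver's loggers are typically silenced in logback.xml rather than in the Akka section of application.conf. The logger name below is an assumption based on the package names visible in the output:

```xml
<!-- Illustrative logback.xml fragment, not from the project's docs:
     raise only the ReactiveMongo loggers to INFO. -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder><pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{20} - %msg%n</pattern></encoder>
  </appender>
  <!-- package prefix taken from the DEBUG lines above -->
  <logger name="reactivemongo" level="INFO"/>
  <root level="INFO"><appender-ref ref="STDOUT"/></root>
</configuration>
```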

Cursor.toList(upTo) does not limit the mongodb query

Say we have a collection with many documents.

collection.find(BSONDocument()).cursor toList 10

This will effectively return 10 entries. But it will take a long time, and if we look at the mongodb query log, we see all collection documents were fetched.

Sat May 11 10:55:21.363 [conn228] query test.finduptospec cursorid:11858874334452462 ntoreturn:0 ntoskip:0 nscanned:102 keyUpdates:0 locks(micros) r: 64 nreturned:101 reslen:2949 0ms
Sat May 11 10:55:21.424 [conn228] getmore test.finduptospec cursorid:11858874334452462 ntoreturn:0 keyUpdates:0 locks(micros) r:33850 nreturned:144630 reslen:4194290 33ms
... and so on

The limit should be passed by the driver to mongodb, but I'm not sure how to achieve that.

lastError has ok=true and err=Some(..) after insert

Currently using ReactiveMongo 0.9-SNAPSHOT.

I'm inserting into a collection. The collection has a unique index on a field. After performing an insert as follows:

collection.insert(myDocument).map { lastError =>
  // success
}.recover {
  case lastError => // print the error
}

When I insert a document that violates the unique index the code falls into the recover clause. When I inspect lastError it has the values:
ok=true
err=Some(E11000 duplicate key error index .....)

The comments for LastError state:

  • @param ok True if the last operation was successful

In this case, the last operation was not successful as it has an error. I believe that ok should be false in this case.

Very verbose output when watching capped collection since Scala 2.10

Hello,

I'm watching a capped collection and pushing results to a SSE socket with Play 2.1

Everything is working well, but the ReactiveMongo output has become very verbose since I moved to Play 2.1-RC1 and Scala 2.10.

With the previous Play 2.1 snapshots (with Scala 2.9.2) these messages were not appearing in the console.

But a line is printed every second:

OP:Query(34,app.postsMongoCollection,0,0)
3002:false
3003:false
3004:false 
3005:false
3006:false    
3007:false
3008:false
3009:false
....

And it never stops ...

I've tried to add "logger.reactivemongo=WARN" in the application.conf file, but it's not working (these do not seem to be log messages).

Here is my code:

 def search(filter: String) = Action { 
    //[...]
    val query = QueryBuilder().query(BSONDocument("message" -> BSONRegex(filter, "")))

    //query results asynchronous cursor
    val cursor = collection.find[JsValue](query, QueryOpts().tailable.awaitData)

    //create the enumerator
    val dataProducer = cursor.enumerate

    //stream the results
    Ok.stream(dataProducer through EventSource()).as("text/event-stream")
}

Is it possible to remove these messages?

Thanks,

Loïc

Inserting the same document into a collection with Unique Index

I have created a sample project which shows that when you insert a duplicate document into a collection with a unique index, the last error result is still ok.

https://github.com/alexanderjarvis/test-unique-index

I tried to create a Functional test around the scenario but it times out on the second request. It would be great if you could fix this too, or explain why.

To test it, just run the project and execute the curl command from the README twice. The second time, I would expect to see some sort of error from MongoDB about the unique index.

Thanks,
Alex

Promises for find queries with an error don't complete

I found out that Promises for find queries that contain an error don't complete. The query I used was:

val query = BSONDocument("$and" -> BSONDocument("start_date" -> currentDate))
val result = collection.find(query)
result.headOption.onComplete {t => Logger.info("Completed, failed: " + t.isFailure)}

In this small test case the onComplete isn't triggered, and therefore dealing with the error isn't possible. For another Promise I see the following debug message in the log:

*** DEBUGGING PROMISE[scala.concurrent.impl.Promise$DefaultPromise@71f68a39] :: requesting: tryComplete (completed? false) with result=Failure(reactivemongo.core.errors.CompleteDBError: MongoError['$and expression must be a nonempty array' (code = 14816)])

It looks like the error isn't propagated back properly to the Promise for headOption.

Odd behavior with BSONDocument.add(..)

I'm using the BSONDocument.add(..) function to add and overwrite certain values of a document. The behavior seems very odd when I insert and then get values from the document. For example:

Here is some sample code:

val doc = BSONDocument(
  "_id" -> BSONObjectID.generate,
  "value" -> "some value"
)
val newDoc = doc.add(BSONDocument("value" -> "a new value"))
println(BSONDocument.pretty(newDoc))
someMongoCollection.insert(newDoc).map { lastError =>
  println(newDoc.get("value"))
}

// Here is whats printed:

{
_id: BSONObjectID("516e04b97350776006f54a63"),
value: BSONString(some value),
value: BSONString(a new value)
}
Some(BSONString(some value))

// Here is what actually gets saved in MongoDB

{
"_id": { "$oid" : "516E04B97350776006F54A63" },
"value": "a new value"
}

What I would expect is that add() would replace existing values that have the same key. Then, after a successful insert, when I call get() I would expect it to return "a new value".
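The expected replace-on-same-key behaviour can be modelled with plain key/value pairs. This is only an illustration of the semantics described above, not the driver's BSONDocument implementation:

```scala
// Later values replace earlier ones with the same key,
// instead of appending a duplicate entry.
def addReplacing(doc: List[(String, String)], extra: List[(String, String)]): List[(String, String)] = {
  val overridden = extra.map(_._1).toSet
  doc.filterNot { case (k, _) => overridden(k) } ++ extra
}

val original = List("_id" -> "516e04b97350776006f54a63", "value" -> "some value")
val updated  = addReplacing(original, List("value" -> "a new value"))
// updated now contains a single "value" entry holding "a new value"
```
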

Non-obvious behavior in BSONReader

I have this code

  implicit object MoneyBSONReader extends BSONReader[BSONDocument, Money] {
    def read(doc: BSONDocument): Money = {
      val minor = doc.getAs[Int]("amountMinor").get
      val currency = CurrencyUnit.of(doc.getAs[String]("currency").get)
      Money.ofMinor(currency, minor)
    }
  }

and if something goes wrong (in my case the value turned out to be a Long, not an Int), the reader just fails silently and returns None. Maybe there should be an exception if the requested field exists but cannot be converted.
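A stricter lookup along the lines suggested here could distinguish "field missing" from "field present but of the wrong type". The sketch below models a document as a plain Map; getAsTry and its behaviour are assumptions for illustration, not the driver's API:

```scala
import scala.reflect.ClassTag
import scala.util.{Failure, Success, Try}

// Fail loudly when the field exists but has the wrong runtime type,
// instead of silently returning None.
def getAsTry[T](doc: Map[String, Any], name: String)(implicit ct: ClassTag[T]): Try[T] =
  doc.get(name) match {
    case None            => Failure(new NoSuchElementException(s"missing field: $name"))
    case Some(ct(value)) => Success(value) // runtime type matches the requested T
    case Some(other)     => Failure(new IllegalArgumentException(
      s"field '$name' is a ${other.getClass.getSimpleName}, not the requested type"))
  }

// The reporter's situation: the stored value is a Long, the reader asks for Int.
val sample = Map[String, Any]("amountMinor" -> 1000L)
```
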

Make Producer.produce public

Currently this method is private. I'd like to use it in my own MongoDAO, like so:

def findHeadOption(elements: Producer[(String, BSONValue)]*)(implicit ec: ExecutionContext): Future[Option[T]] = {
  collection.find(new BSONDocument(
    elements.flatMap { el =>
      el.produce.map(value => Seq(Try(value))).getOrElse(Seq.empty)
    }.toStream)).cursor[T].headOption()
}

Exception on BSONBinary with UserDefinedSubtype deserialization

The following code throws a RuntimeException. The bug is here. 0x80 occupies 2 bytes. How does the BSON protocol serialise the subtype?

import reactivemongo.bson._
import org.jboss.netty.buffer.ChannelBuffers
import java.nio.ByteOrder

val byteArray: Array[Byte] = Array(127, -128, 32, -65) 
val binary = ChannelBuffers.wrappedBuffer(byteArray)
val document = BSONDocument("test" -> BSONBinary(binary, Subtype.UserDefinedSubtype))
val iterator = DefaultBSONIterator(document.toBuffer)
iterator.next

Default FailoverStrategy

I just got bitten by this: the default value for FailoverStrategy is... unexpected.

First, I'm not sure why the strategy applies to the collection rather than to a specific request. Then, there are cases (most of the time?) where a fail-fast approach is what the user wants, for example when using a unique index.

Following the principle of least astonishment, I believe that the default FailoverStrategy should have 0 retries.
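To illustrate the point, here is a minimal self-contained model of a failover strategy. The field names mirror the driver's FailoverStrategy, but this is an illustration, not its actual implementation: with retries = 0 the schedule of retry delays is empty, so the first error surfaces immediately.

```scala
import scala.concurrent.duration._

// Illustrative model: initialDelay, number of retries, and a factor
// applied to the delay before each retry attempt.
case class FailoverStrategy(
    initialDelay: FiniteDuration = 100.millis,
    retries: Int = 5,
    delayFactor: Int => Double = n => n.toDouble)

// The delays a failing operation would wait through before giving up.
def retryDelays(s: FailoverStrategy): Seq[Duration] =
  (1 to s.retries).map(n => s.initialDelay * s.delayFactor(n))

// The fail-fast default argued for above: no retries, no delays.
val failFast = FailoverStrategy(retries = 0)
```
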

0.10-SNAPSHOT not available

Hi,

Would it be possible to make a 0.10-SNAPSHOT available with Akka 2.2-M3 integration? That'd be great!

Mathias

killcursors: found 0 of 1

I see a lot of failing killcursors in my mongodb logs:

Wed Feb  6 14:21:05 [conn453] killcursors: found 0 of 1

I don't know whether ReactiveMongo is using a wrong cursor id or whether the cursor was already killed.

Intellij cannot resolve type of BSONCollection

I'm currently using 0.9-SNAPSHOT.

I have code like this:

import reactivemongo.api._
import collections.default.BSONCollection
import reactivemongo.bson._
import play.api.libs.concurrent.Execution.Implicits._

object DBReactiveTest {
  val driver = new MongoDriver
  val connection = driver.connection(List("localhost:27017"))
  val db = connection.db("my_db")
  val collection = db.collection("my_collection")

  def find(id: String) = {
    val query = BSONDocument("_id" -> BSONObjectID(id))
    collection.find(query).cursor[BSONDocument]
  }
}

This code compiles; however, IntelliJ cannot detect the type of the collection val. I have to explicitly write:

val collection = db.collection("sessionTokens").as[BSONCollection]

And this resolves in IntelliJ.

Tailable Cursor on Capped Collection in Replica Set doesn't work as expected

This is reproducible in the reactivemongo-tailablecursor-demo sample project.

Changing the configuration to use a replica set

mongodb.servers = [
  "host1.example.com",
  "host2.example.com"   
]

Whenever a new message is sent, the onMessage callback of the websocket is called with the new message (as expected), followed by all the elements currently in the collection, including the new message.

For example submitting the following messages in order:

  1. { title: "1", content: "aaa" }
  2. { title: "2", content: "bbb" }
  3. { title: "3", content: "ccc" }
  4. { title: "4", content: "ddd" }

results in server output of
received {"title":"2","content":"bbb"}
received {"title":"3","content":"ccc"}
received {"title":"4","content":"ddd"}

and html output of

Mon Dec 03 2012 17:18:52 GMT-0700 (MST): {"_id":{"$oid":"50bd416bc9d16682d5f35fe7"},"title":"4","content":"ddd"}

Mon Dec 03 2012 17:18:52 GMT-0700 (MST): {"_id":{"$oid":"50bd415fc9d16682d5f35fe6"},"title":"3","content":"ccc"}

Mon Dec 03 2012 17:18:52 GMT-0700 (MST): {"_id":{"$oid":"50bd4156c9d16682d5f35fe5"},"title":"2","content":"bbb"}

Mon Dec 03 2012 17:18:52 GMT-0700 (MST): {"_id":{"$oid":"50bd413dc9d16682d5f35fe4"},"title":"1","content":"aaa"}

Mon Dec 03 2012 17:18:51 GMT-0700 (MST): {"_id":{"$oid":"50bd416bc9d16682d5f35fe7"},"title":"4","content":"ddd"}

Mon Dec 03 2012 17:18:40 GMT-0700 (MST): {"_id":{"$oid":"50bd415fc9d16682d5f35fe6"},"title":"3","content":"ccc"}

Mon Dec 03 2012 17:18:40 GMT-0700 (MST): {"_id":{"$oid":"50bd4156c9d16682d5f35fe5"},"title":"2","content":"bbb"}

Mon Dec 03 2012 17:18:40 GMT-0700 (MST): {"_id":{"$oid":"50bd413dc9d16682d5f35fe4"},"title":"1","content":"aaa"}

Mon Dec 03 2012 17:18:39 GMT-0700 (MST): {"_id":{"$oid":"50bd415fc9d16682d5f35fe6"},"title":"3","content":"ccc"}

Mon Dec 03 2012 17:18:31 GMT-0700 (MST): {"_id":{"$oid":"50bd4156c9d16682d5f35fe5"},"title":"2","content":"bbb"}

Mon Dec 03 2012 17:18:31 GMT-0700 (MST): {"_id":{"$oid":"50bd413dc9d16682d5f35fe4"},"title":"1","content":"aaa"}

Mon Dec 03 2012 17:18:31 GMT-0700 (MST): {"_id":{"$oid":"50bd4156c9d16682d5f35fe5"},"title":"2","content":"bbb"}

Mon Dec 03 2012 17:18:24 GMT-0700 (MST): {"_id":{"$oid":"50bd413dc9d16682d5f35fe4"},"title":"1","content":"aaa"}

Remove plain text password logging

The ReactiveMongo debug log contains the database name, username and password used for the connection. Having the password logged in plaintext is a security risk.

Ex.
2013-03-08 04:57:54,736 - [DEBUG] - from reactivemongo.core.actors.MongoDBSystem in mongodb-akka.actor.default-dispatcher-6 AUTH: nothing to do. authenticationHistory is AuthHistory(List((Authenticate(db_name,db_username,password),List())))

NoSuchElementException: None.get with Macros.handler in Specs2 test

I get a java.util.NoSuchElementException: None.get exception during testing. The strange thing is that this occurs only during testing; the same code works when I access the page in a browser. If I debug the stream during testing, all the needed data is present.

Here is the stack trace:

None.get
java.util.NoSuchElementException: None.get
    at models.Platform$$anon$1.read(Platform.scala:85)
    at models.Platform$$anon$1.read(Platform.scala:85)
    at reactivemongo.bson.BSONReader$$anonfun$readTry$1.apply(handlers.scala:35)
    at reactivemongo.bson.BSONReader$class.readTry(handlers.scala:35)
    at models.Platform$$anon$1.readTry(Platform.scala:85)
    at reactivemongo.bson.BSONValue$ExtendedBSONValue$.asTry$extension(bson.scala:62)
    at reactivemongo.bson.BSONValue$ExtendedBSONValue$.as$extension(bson.scala:65)
    at models.PlatformDAOImpl$$anonfun$1$$anonfun$apply$1.apply(PlatformDAOImpl.scala:72)
    at models.PlatformDAOImpl$$anonfun$1$$anonfun$apply$1.apply(PlatformDAOImpl.scala:72)
    at models.PlatformDAOImpl$$anonfun$1.apply(PlatformDAOImpl.scala:72)
    at models.PlatformDAOImpl$$anonfun$1.apply(PlatformDAOImpl.scala:72)

Here are my models:

case class Platform(
  id: String,
  name: String,
  url: String,
  priority: Int,
  easeOfUse: Int,
  effectiveness: Int,
  features: Int,
  safety: Int,
  popularity: Int,
  established: Int,
  genderRatio: GenderRatio,
  userBase: String,
  teaser: String,
  prices: List[Price]) {

  def points()(implicit lang: Lang) = Points.calc(
    easeOfUse,
    effectiveness,
    features,
    safety,
    popularity,
    genderRatio,
    lang
  )
}

object Platform {
  implicit val platformJSONFormat = Json.format[Platform]
  implicit val platformBSONFormat = Macros.handler[Platform]
}

case class GenderRatio(males: Int, females: Int)

object GenderRatio {
  implicit val genderRatioJSONFormat = Json.format[GenderRatio]
  implicit val genderRatioBSONFormat = Macros.handler[GenderRatio]
}

case class Price(label: String, value: String)

object Price {
  implicit val priceJSONFormat = Json.format[Price]
  implicit val priceBSONFormat = Macros.handler[Price]
}

The query is an aggregate command:

db.command(Aggregate("categories", Seq(
  Match(BSONDocument("id" -> category.id, "lang" -> category.lang.code)),
  Unwind("platforms"),
  Project(
    "_id" -> BSONInteger(0),
    "id" -> BSONString("$platforms.id"),
    "name" -> BSONString("$platforms.name"),
    "url" -> BSONString("$platforms.url"),
    "priority" -> BSONString("$platforms.priority"),
    "easeOfUse" -> BSONString("$platforms.easeOfUse"),
    "effectiveness" -> BSONString("$platforms.effectiveness"),
    "features" -> BSONString("$platforms.features"),
    "safety" -> BSONString("$platforms.safety"),
    "popularity" -> BSONString("$platforms.popularity"),
    "established" -> BSONString("$platforms.established"),
    "genderRatio" -> BSONString("$platforms.genderRatio"),
    "userBase" -> BSONString("$platforms.userBase"),
    "teaser" -> BSONString("$platforms.teaser"),
    "prices" -> BSONString("$platforms.prices")
)))).map { stream => stream.toList.map { doc => doc.as[Platform] }}

Mongo Version is 2.4 latest
ReactiveMongo Version is 0.9

Left not being handled by actor

Hey there, I found a nasty piece of code not handling errors :)

[ERROR] [01/27/2013 01:24:58.131] [mongodb-akka.actor.default-dispatcher-7] [akka://mongodb/user/$c] Either.right.value on Left
java.util.NoSuchElementException: Either.right.value on Left
at scala.util.Either$RightProjection.get(Either.scala:454)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1.applyOrElse(actors.scala:236)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:425)
at akka.actor.ActorCell.invoke(ActorCell.scala:386)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:230)
at akka.dispatch.Mailbox.run(Mailbox.scala:212)
at akka.dispatch.ForkJoinExecutorConfigurator$MailboxExecutionTask.exec(AbstractDispatcher.scala:502)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:262)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:975)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1478)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)

Scala 2.10.0 Snapshot

Hey,

could you please publish a snapshot build with the 2.10.0 release?

Best, Tom

ExecutionContext and testing

Hello,

I am trying to understand what effect the ExecutionContext has on testing and on setting up test data, because if I insert a document in my specs2 specification, I cannot find it using a helper object (DAO) that declares its own ExecutionContext, similar to MongoController.

For example (inside Play 2's running(FakeApplication()) { ... }):

val id = BSONObjectID.generate
val user = collection.insert(BSONDocument("_id" -> id, "name" -> BSONString("alex")))
val result = Await.result(user, Duration(1, "seconds"))
result.ok must beTrue

val normal = collection.find(BSONDocument("_id" -> new BSONObjectID(id.stringify))).headOption
val normalResult = Await.result(normal, Duration(1, "seconds"))
normalResult must beSome

The above works fine and I declare the ExecutionContext as

implicit val ec: ExecutionContext = ExecutionContext.Implicits.global

but the problem is that if I try to do something like this, it returns None:

val dao = UserDAO.findById(id.stringify)
val daoResult = Await.result(dao, Duration(1, "seconds"))
daoResult must beSome

This is the code for my UserDAO:

package models

object UserDAO extends MongoDAO[User] {

  override def collection = db("users")

  override implicit val reader = User.UserBSONReader

  def findByEmail(email: String) = findHeadOption("email", email)

  def findByAccessToken(accessToken: String) = findHeadOption("accessToken", accessToken)

}

and MongoDAO, which it extends:

package models

import play.api.Play.current
import play.modules.reactivemongo._
import reactivemongo.bson._
import scala.concurrent.ExecutionContext
import reactivemongo.bson.handlers.BSONReader
import reactivemongo.bson.handlers.DefaultBSONHandlers.DefaultBSONReaderHandler
import reactivemongo.bson.handlers.DefaultBSONHandlers.DefaultBSONDocumentWriter

trait MongoDAO[T] {

  implicit val ec: ExecutionContext = ExecutionContext.Implicits.global

  def db = ReactiveMongoPlugin.db
  def collection: reactivemongo.api.Collection

  implicit val reader: BSONReader[T]

  def findHeadOption(attribute: String, value: String) = {
    collection.find(BSONDocument(attribute -> BSONString(value))).headOption
  }

  def findById(id: String) = {
    collection.find(BSONDocument("_id" -> new BSONObjectID(id))).headOption
  }

}

When I use my DAO functions from within the Play 2 controller and run the application, they work fine; it is only when testing and adding new data that this doesn't seem to work.

Have you any idea what could be going on?

Thanks,
Alex

Future API

I tried to use the project with the latest version of Scala Futures, and it seems that the API has changed.

java.lang.ClassCastException: scala.util.Failure cannot be cast to scala.Either
    at reactivemongo.api.Failover$$anonfun$reactivemongo$api$Failover$$send$1.apply(api.scala:44)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:33)
    at scala.concurrent.forkjoin2.ForkJoinTask$AdaptedRunnableAction.exec(ForkJoinTask.java:1411)
    at scala.concurrent.forkjoin2.ForkJoinTask.doExec(ForkJoinTask.java:256)
    at scala.concurrent.forkjoin2.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:965)
    at scala.concurrent.forkjoin2.ForkJoinPool.runWorker(ForkJoinPool.java:1468)

If I am correct, it now uses Try objects (Success / Failure).

Thanks.

BSONObjectID.isValid

It would be great if we could verify that a string is a valid BSONObjectID. This would allow us to write a Play PathBindable looking something like this:

implicit def objectIdPathBindable = new PathBindable[BSONObjectID] {
  def bind(key: String, value: String) = {
    if (BSONObjectID.isValid(value))
      Right(new BSONObjectID(value))
    else
      Left("Cannot parse parameter " + key + " as BSONObjectID")
  }
  def unbind(key: String, value: BSONObjectID) = value.toString
}

The java version of mongo's driver has this:
http://api.mongodb.org/java/2.0/org/bson/types/ObjectId.html#isValid(java.lang.String)
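A valid ObjectID string is 24 hexadecimal characters (12 bytes), so the requested check can be sketched without the driver at all. This is an illustration of what such an isValid could verify, not the driver's own code:

```scala
// An ObjectID string must be exactly 24 hex characters (0-9, a-f, A-F).
def isValidObjectId(s: String): Boolean =
  s.length == 24 && s.forall(c => c.isDigit || ('a' <= c.toLower && c.toLower <= 'f'))
```
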

Occasional java.lang.NullPointerException

I have written a play application to test whether GridFS performs better than FileIO.

Sometimes the following error pops up:

[ERROR] [10/20/2012 23:58:43.413] [mongodb-akka.actor.default-dispatcher-17] [akka://mongodb/user/$c] null
java.lang.NullPointerException
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1$$anonfun$apply$17$$anonfun$apply$20.reflMethod$Method2(actors.scala:201)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1$$anonfun$apply$17$$anonfun$apply$20.apply(actors.scala:201)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1$$anonfun$apply$17$$anonfun$apply$20.apply(actors.scala:201)
at scala.Option.map(Option.scala:133)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1$$anonfun$apply$17.apply(actors.scala:201)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1$$anonfun$apply$17.apply(actors.scala:199)
at scala.collection.Iterator$class.foreach(Iterator.scala:772)
at scala.collection.immutable.VectorIterator.foreach(Vector.scala:648)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:73)
at scala.collection.immutable.Vector.foreach(Vector.scala:63)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1.apply(actors.scala:199)
at reactivemongo.core.actors.MongoDBSystem$$anonfun$receive$1.apply(actors.scala:143)
at akka.actor.Actor$class.apply(Actor.scala:318)
at reactivemongo.core.actors.MongoDBSystem.apply(actors.scala:90)
at akka.actor.ActorCell.invoke(ActorCell.scala:626)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:197)
at akka.dispatch.Mailbox.run(Mailbox.scala:179)
at akka.dispatch.ForkJoinExecutorConfigurator$MailboxExecutionTask.exec(AbstractDispatcher.scala:516)
at akka.jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:259)
at akka.jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:975)
at akka.jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1479)
at akka.jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)

Missing index types (Mongo 2.4)

There are several missing index types in IndexType; I would have needed at least "text" and "2dsphere" to name some. The "text" type is experimental, but "2dsphere" is quite useful. This is made worse by the fact that IndexType is a sealed trait, which makes it impossible to add new types. Now I have to fall back to manually editing system.indexes, which feels ugly...

Edit: I realized these are all related to the new Mongo 2.4 index types, so maybe updating the driver to support 2.4 would be a more proper topic.

Feature request: collection.save()

Of course we can use collection.update(Json.obj("_id" -> 1), Json.obj(update), upsert = true), but it feels more "mongoish" to have that method available on the collection.
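The requested save() is essentially "update with a selector on _id and upsert = true". Those semantics can be modelled in memory with plain Maps; this is an illustration only, not the driver's API:

```scala
// In-memory model: a collection is a Map from _id to document.
// Mirrors update(selector, doc, upsert): replace when the selector matches
// an existing _id, otherwise insert only when upsert is enabled.
def update(coll: Map[String, Map[String, String]], selector: String,
           doc: Map[String, String], upsert: Boolean): Map[String, Map[String, String]] =
  if (coll.contains(selector) || upsert) coll + (selector -> doc) else coll

// The proposed save() is just update on _id with upsert = true.
def save(coll: Map[String, Map[String, String]], id: String,
         doc: Map[String, String]): Map[String, Map[String, String]] =
  update(coll, id, doc, upsert = true)

val c1 = save(Map.empty, "1", Map("name" -> "a"))
val c2 = save(c1, "1", Map("name" -> "b")) // same _id: replaced, not duplicated
```
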

sbt test failed with MongoDB v.2.2.2 on latest master

Is this expected?

Console output:

...
[info] IndexesSpec
[info]
[info] ReactiveMongo Geo Indexes should
[error] ! insert some points
[error] TimeoutException: Futures timed out after 5 seconds
[error] IndexesSpec$$anonfun$1$$anonfun$apply$9.apply(IndexesSpec.scala:22)
[error] IndexesSpec$$anonfun$1$$anonfun$apply$9.apply(IndexesSpec.scala:18)
[info] + make index
[info] + fail to insert some points out of range
[info] + retrieve indexes
[info]
[info]
[info] Total for specification IndexesSpec
[info] Finished in 9 seconds, 383 ms
[info] 4 examples, 0 failure, 1 error
...
[info] CollectionSpec
[info]
[info] ReactiveMongo should
[info] + create a collection
[info] + convert to capped
[error] x check if it's capped
[error] the value is not equal to 'true' (CollectionSpec.scala:27)
[info] + insert some docs then count
[info] + empty the capped collection
[info] + drop it
[info]
[info]
[info] Total for specification CollectionSpec
[info] Finished in 14 seconds, 60 ms
[info] 6 examples, 1 failure, 0 error
[info]
[error] Error: Total 36, Failed 1, Errors 1, Passed 34, Skipped 0
[error] Failed tests:
[error] CollectionSpec
[error] Error during tests:
[error] IndexesSpec
...

What am I doing wrong? (Non-blocking reads possible?)

Hi!

I am attempting to use play in a non-blocking way. Basically I want to "pause" a request while doing a database read, so the thread servicing my request could go service another request.

Am I right in assuming there is no way to do this, even with reactive? For example, I had the following two code blocks:

  def blockingCheck = Action {

    val result: Identity = Await.result(
      IdentityDAO.find(
        BSONDocument("_id" -> BSONObjectID("515e303bf3360ea4b769370a"))
      ).headOption()
      , Duration.create(1, "min")).get

    Ok(result.email.get)
  }

  def nonBlockingCheck = Action {
    implicit request =>
      Async {
        implicit val reader = IdentityDAO.identityBSONReader
        val query = BSONDocument("_id" -> BSONObjectID("515e303bf3360ea4b769370a"))

        val result: FlattenedCursor[Identity] = IdentityDAO.find(query)

        result.toList.map(R =>
          Ok(R.head.email.get)
        )

      }
  }

I ran them both, and got identical performance ( see http://imgur.com/a/h3p7Z ).

I guess reactive would still win to do an async save to the database.. but there isn't really any way to do my reads async? Or did I botch the code above?

Thanks!

Project command does only support integer values

As described in the mongo documentation, the project command can also accept other values like strings and objects. ReactiveMongo should support these types too.

Current signature is:

case class Project(fields: (String, Int)*) extends PipelineOperator

But it should be:

case class Project(fields: (String, Any)*) extends PipelineOperator

Or better:

case class Project(fields: (String, BSONValue)*) extends PipelineOperator

bulkInsert doesn't like an empty enumerator

When I try to do a bulk insert from a temporary map reduce collection into another collection I get the following exception if the temporary collection is empty:

reactivemongo.core.commands.LastError: DatabaseException['Message contains no documents' (code = 13066)]

Ideally bulkInsert should just happily do nothing when given an enumerator that consists of only EOF.

Metadata is "flattened" in GridFS file document

I'm writing a file to GridFS with custom metadata:

DefaultFileToSave(theName, theContentType,
  metadata = BSONDocument("foo" -> BSONString("bar")))

According to the GridFS spec, the metadata should be stored in MongoDB as a subdocument:

{ <default fields (_id, fileName...) >, metadata: {foo: 'bar'} }

However, the BSONDocument I pass is merged with the main document (I tracked it down to gridfs.scala, line 197), resulting in:

{ <default fields>, foo: 'bar' }

Is this a bug or do you expect the metadata property to be passed verbatim from the Scala code? (which I will do in the meantime anyway)

Add support for aggregation framework

MongoDB has the map/reduce and the new Aggregation Framework for data processing. ReactiveMongo would be a great driver for these, as the non-blocking nature would be beneficial for these sometimes long running jobs.

Wrong package naming ?

Everywhere reactivemongo is used as the top package reference (import reactivemongo....), but there is no reactivemongo anymore but scala. I am confused.

When I check this out I get errors complaining about that. Do I miss something ?

More handy way with implicit reader (awkward imports)

Hello Stephane,
I am playing with 0.9 snapshot using BSONDocumentReader typeclass. It works well but I need to write awkward imports to make it work, otherwise implicit conversions from bson._ conflict with my reader.

Look at this example:

import reactivemongo.bson.{BSONDocument, BSONDocumentReader, BSONDocumentWriter, BSONString}

case class Person(firstName: String, lastName: String)

object PersonReader extends BSONDocumentReader[Person] {
def read(doc: BSONDocument) = Person(
doc.getAsString.get,
doc.getAsString.get)
}

def test2a = {
val cursor = collection.find(BSONDocument("firstName" -> "Jack")).cursor
cursor.enumerate.apply(Iteratee.foreach { doc =>
println("test2a: " + doc)
})
}

I can not write "import reactivemongo.bson._" here because I get error:
ambiguous implicit values:
[fsc] both value reader of type TestMongo.PersonReader.type
[fsc] and object BSONDocumentIdentity in trait DefaultBSONHandlers of type DefaultBSONHandlers.this.BSONDocumentIdentity.type
[fsc] match expected type reactivemongo.bson.BSONDocumentReader[T]
[fsc] val cursor = collection.find(BSONDocument("firstName" -> "Jack")).cursor

Also I must introduce a local "implicit val reader" here, I can not mark object PersonReader as implicit due to same reason.

To simplify this, seems default implicits could be placed into a subpackage of 'bson'. Or maybe I miss something?

Recommend Projects

  • React photo React

    A declarative, efficient, and flexible JavaScript library for building user interfaces.

  • Vue.js photo Vue.js

    ๐Ÿ–– Vue.js is a progressive, incrementally-adoptable JavaScript framework for building UI on the web.

  • Typescript photo Typescript

    TypeScript is a superset of JavaScript that compiles to clean JavaScript output.

  • TensorFlow photo TensorFlow

    An Open Source Machine Learning Framework for Everyone

  • Django photo Django

    The Web framework for perfectionists with deadlines.

  • D3 photo D3

    Bring data to life with SVG, Canvas and HTML. ๐Ÿ“Š๐Ÿ“ˆ๐ŸŽ‰

Recommend Topics

  • javascript

    JavaScript (JS) is a lightweight interpreted programming language with first-class functions.

  • web

    Some thing interesting about web. New door for the world.

  • server

    A server is a program made to process requests and deliver data to clients.

  • Machine learning

    Machine learning is a way of modeling and interpreting data that allows a piece of software to respond intelligently.

  • Game

    Some thing interesting about game, make everyone happy.

Recommend Org

  • Facebook photo Facebook

    We are working to build community through open source technology. NB: members must have two-factor auth.

  • Microsoft photo Microsoft

    Open source projects and samples from Microsoft.

  • Google photo Google

    Google โค๏ธ Open Source for everyone.

  • D3 photo D3

    Data-Driven Documents codes.