
faraday's Introduction

Faraday

Amazon DynamoDB client for Clojure

  • API reference: Codox, clj-doc
  • Leiningen: [com.taoensso/faraday "1.12.2"]
  • deps.edn: com.taoensso/faraday {:mvn/version "1.12.2"}


DynamoDB makes a great companion for Clojure apps that need a simple, reliable way to persist data that scales with predictable performance. Faraday is a small, fast, and intuitive DynamoDB client library for Clojure, built around the AWS Java SDK and originally adapted from Rotary by James Reeves.

Why Faraday?

  • Small and simple API, with coverage of the most useful DynamoDB features
  • Great performance (zero overhead to the official Java SDK)
  • Uses Nippy for full support of Clojure's rich data types (with compression too)
  • The AWS Java SDK for DynamoDB is awkward and verbose
  • General purpose AWS SDKs for Clojure such as Amazonica or aws-api inherit the awkwardness of the AWS SDK when used to interact with DynamoDB

Getting started

Add Faraday as a dependency to your project and require it in your namespace:

(ns my-ns
  (:require [taoensso.faraday :as far]))

Preparing a database

Option 1 - Run a local DynamoDB instance

First, start a DynamoDB Local instance. Once DynamoDB Local is up and running in your terminal, you should see something like:

$ docker run -p 8000:8000 amazon/dynamodb-local
Initializing DynamoDB Local with the following configuration:
Port:		8000
InMemory:	true
DbPath:		null
SharedDb:	false
shouldDelayTransientStatuses:	false
CorsParams:	*

Then proceed to connecting with your local instance in the next section.

Option 2 - Use DynamoDB in the cloud

Make sure you've got an AWS account - note that there's a free tier with limited DynamoDB storage and read/write throughput. Next you'll need credentials for an IAM user with read/write access to your DynamoDB tables.

Ready?

Connecting

(def client-opts
  {;;; For DynamoDB Local just use some random strings here, otherwise include your
   ;;; production IAM keys:
   :access-key "<AWS_DYNAMODB_ACCESS_KEY>"
   :secret-key "<AWS_DYNAMODB_SECRET_KEY>"

   ;;; You may optionally override the default endpoint if you'd like to use DynamoDB
   ;;; Local or a different AWS Region (Ref. http://goo.gl/YmV80o), etc.:
   ;; :endpoint "http://localhost:8000"                   ; For DynamoDB Local
   ;; :endpoint "http://dynamodb.eu-west-1.amazonaws.com" ; For EU West 1 AWS region

   ;;; You may optionally provide your own (pre-configured) instance of the Amazon
   ;;; DynamoDB client for Faraday functions to use.
   ;; :client (AmazonDynamoDBClientBuilder/defaultClient)
  })

(far/list-tables client-opts)
=> [] ; That's good, we don't have any tables yet :)

Now let's create a table. This is actually one of the more complicated parts of working with DynamoDB since it requires understanding how DynamoDB provisions capacity and how its idiosyncratic primary keys work. We can safely ignore the specifics for now.

(far/create-table client-opts :my-table
  [:id :n]  ; Primary key named "id", (:n => number type)
  {:throughput {:read 1 :write 1} ; Read & write capacity (units/sec)
   :block? true ; Block thread during table creation
   })

;; Wait a minute for the table to be created... got a sandwich handy?

(far/list-tables client-opts)
=> [:my-table] ; There's our new table!

Let's write something to :my-table and then fetch it:

(far/put-item client-opts
    :my-table
    {:id 0 ; Remember that this is our primary (indexed) key
     :name "Steve" :age 22 :data (far/freeze {:vector    [1 2 3]
                                              :set      #{1 2 3}
                                              :rational (/ 22 7)
                                              ;; ... Any Clojure data goodness
                                              })})

(far/get-item client-opts :my-table {:id 0})
=> {:id 0 :name "Steve" :age 22 :data {:vector [1 2 3] ...}}

Remaining API

DynamoDB gives you tons of power including secondary indexes, conditional writes, batch operations, atomic counters, tuneable read consistency and more.

Most of this is controlled through optional arguments and is pretty easy to pick up from the relevant API docs:

Tables: list-tables, describe-table, create-table, ensure-table, update-table, delete-table.

Items: get-item, put-item, update-item, delete-item.
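For instance, update-item can modify an item in place via a DynamoDB update expression. A hedged sketch against the :my-table example from above (the :update-expr / :expr-attr-names / :expr-attr-vals option names follow the current API docs; check the update-item docstring for the full option map):

```clojure
;; Sketch only: assumes the client-opts and :my-table defined earlier.
(far/update-item client-opts :my-table
  {:id 0}                                ; Primary key of the item
  {:update-expr     "SET #a = :new-age"  ; DynamoDB UpdateExpression syntax
   :expr-attr-names {"#a" "age"}
   :expr-attr-vals  {":new-age" 23}
   :return          :all-new})           ; Return the updated item
```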

Batch items: batch-get-item, batch-write-item.
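As a sketch, batch-write-item combines puts and deletes for a table in one round trip; the request shape below follows the example given in batch-write-item's own docstring (quoted in an issue further down), with illustrative ids and names:

```clojure
(far/batch-write-item client-opts
  {:my-table {:put    [{:id 1 :name "Sally"}
                       {:id 2 :name "Jane"}]
              :delete [{:id [3 4 5]}]}}) ; Delete keys 3, 4 and 5
```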

Querying: query, scan, scan-parallel.
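A hedged sketch of the two read styles against the :my-table example above (see the query/scan docstrings for the supported operators and the full option maps):

```clojure
;; Query by primary key condition (:eq on the hash key):
(far/query client-opts :my-table {:id [:eq 0]})

;; Scan with an attribute condition (assumed :attr-conds option):
(far/scan client-opts :my-table {:attr-conds {:age [:gt 18]}})
```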

Transactions: transact-write-items, transact-get-items.

You can also check out the official AWS DynamoDB documentation, though there's a lot of Java-land complexity you won't need to deal with when using Faraday. The most useful single docs are probably the DynamoDB data model and the DynamoDB Best Practices.

Development

This project uses Testcontainers to manage starting and stopping a local DynamoDB instance in docker.

Run the tests locally with:

lein test

Or run tests from a REPL like:

taoensso.faraday.tests.main> (clojure.test/run-tests)

To run the entire test suite against all supported versions of Clojure, use:

lein test-all

Contributions

Please see GitHub issues for bugs, ideas, etc. Pull requests welcome. For a general question on usage, try StackOverflow or ask the Faraday users and developers in #faraday at clojurians.slack.com.

License

Copyright © 2013-2023 Peter Taoussanis and contributors, licensed under EPL 1.0 (same as Clojure).

faraday's People

Contributors

bpot, ghoseb, gowind, gusbicalho, jaley, jeffh, joelittlejohn, johnchapin, jonathanharford, kipz, lambda-knight, madeye-matt, marcuswr, maxcountryman, michaelblume, moea, mouryaravi, neilprosser, paraseba, paulbutcher, philippkueng, ptaoussanis, quantisan, rakeshp, ricardojmendez, sheelc, smithtim, theleoborges, xabi


faraday's Issues

updateTable hangs if passed the same throughput as the table currently has

If we pass the same throughput as the table currently has (or no throughput at all, which defaults to the same), the update-table call hangs. I initially thought that the table's state never changed to :updating, but looking at table-status-watch that wouldn't have affected it, since it only checks that the current status is not the one passed.

Execution actually hangs when we call AmazonDynamoDB.updateTable.

This probably hadn't manifested before removing stepping (see #78) because if the throughput was the same as before, no steps were generated and the function was never called. Adding a single step to the old implementation with the same values as the current throughput reproduces the issue.

This happens against both the local database and a live instance.

API freeze + release v1.0.0 (feedback wanted)

Getting close to a v1.0.0 (stable API) release. Any remaining issues / suggested changes? Any anecdotes of folks successfully using the current API in production? Feel free to mail me [ptaoussanis at taoensso dot com] if you'd prefer to discuss in private.

Cheers! :-)

1.3.1 causes a new error

I guess it is a dependency problem.

When I type (ns my-app (:require [taoensso.faraday :as far])), I got a stack trace:

Wrong number of args (1) passed to: core$--GT
[Thrown class clojure.lang.ArityException]

Restarts:
0: [QUIT] Quit to the SLIME top level

Backtrace:
0: Compiler.java:6473 clojure.lang.Compiler.macroexpand1
1: Compiler.java:6546 clojure.lang.Compiler.analyzeSeq
2: Compiler.java:6361 clojure.lang.Compiler.analyze
3: Compiler.java:6322 clojure.lang.Compiler.analyze
4: Compiler.java:8412 clojure.lang.Compiler$CaseExpr$Parser.parse
5: Compiler.java:6560 clojure.lang.Compiler.analyzeSeq
6: Compiler.java:6361 clojure.lang.Compiler.analyze
7: Compiler.java:6322 clojure.lang.Compiler.analyze
8: Compiler.java:5708 clojure.lang.Compiler$BodyExpr$Parser.parse
9: Compiler.java:6009 clojure.lang.Compiler$LetExpr$Parser.parse


lein deps :tree

WARNING!!! possible confusing dependencies found:
[lib-noir "0.8.1"] -> [ring "1.2.0"]
overrides
[ring-server "0.3.1"] -> [ring "1.2.1"]

[lib-noir "0.8.1"] -> [ring-middleware-format "0.3.2"] -> [org.clojure/tools.reader "0.7.10"]
overrides
[markdown-clj "0.9.41"] -> [org.clojure/clojurescript "0.0-2127"] -> [org.clojure/tools.reader "0.8.0"]
and
[com.taoensso/faraday "1.3.1"] -> [com.taoensso/encore "1.6.0"] -> [org.clojure/tools.reader "0.8.3"]
and
[com.taoensso/faraday "1.3.1"] -> [com.taoensso/nippy "2.6.3"] -> [org.clojure/tools.reader "0.8.3"]

[compojure "1.1.6"] -> [org.clojure/core.incubator "0.1.0"]
overrides
[com.cemerick/friend "0.2.0"] -> [org.clojure/core.incubator "0.1.1"]

[compojure "1.1.6"] -> [org.clojure/tools.macro "0.1.0"]
overrides
[com.taoensso/timbre "3.0.0"] -> [org.clojure/tools.macro "0.1.5"]
and
[com.taoensso/tower "2.0.2"] -> [org.clojure/tools.macro "0.1.5"]
and
[com.taoensso/tower "2.0.2"] -> [com.taoensso/timbre "2.7.1"] -> [org.clojure/tools.macro "0.1.5"]

[ring/ring-devel "1.2.1"] -> [clj-stacktrace "0.2.5"]
overrides
[com.taoensso/tower "2.0.2"] -> [com.taoensso/timbre "2.7.1"] -> [clj-stacktrace "0.2.7"]

[clojure-complete "0.2.3" :exclusions [[org.clojure/clojure]]]
[com.cemerick/friend "0.2.0"]
[com.google.inject/guice "2.0"]
[aopalliance "1.0"]
[net.sourceforge.nekohtml/nekohtml "1.9.10"]
[xerces/xercesImpl "2.8.1"]
[xml-apis "1.3.03"]
[org.apache.httpcomponents/httpclient "4.2.1"]
[org.apache.httpcomponents/httpcore "4.2.1"]
[org.openid4java/openid4java-nodeps "0.9.6" :exclusions [[com.google.code.guice/guice]]]
[commons-logging "1.1.1"]
[net.jcip/jcip-annotations "1.0"]
[robert/hooke "1.1.2"]
[slingshot "0.10.2"]
[com.taoensso/faraday "1.3.1"]
[com.amazonaws/aws-java-sdk "1.7.8" :exclusions [[joda-time]]]
[com.fasterxml.jackson.core/jackson-annotations "2.1.1"]
[com.fasterxml.jackson.core/jackson-databind "2.1.1"]
[com.taoensso/encore "1.6.0"]
[com.taoensso/nippy "2.6.3"]
[org.iq80.snappy/snappy "0.3"]
[org.tukaani/xz "1.5"]
[com.taoensso/timbre "3.0.0"]
[io.aviso/pretty "0.1.8"]
[com.taoensso/tower "2.0.2"]
[compojure "1.1.6"]
[org.clojure/core.incubator "0.1.0"]
[org.clojure/tools.macro "0.1.0"]
[ring/ring-core "1.2.1"]
[commons-fileupload "1.3"]
[commons-io "2.4"]
[environ "0.4.0"]
[im.chit/cronj "1.0.1"]
[clj-time "0.6.0"]
[im.chit/hara "1.0.1"]
[im.chit/ova "1.0.1"]
[lib-noir "0.8.1"]
[cheshire "5.3.1"]
[com.fasterxml.jackson.core/jackson-core "2.3.1"]
[com.fasterxml.jackson.dataformat/jackson-dataformat-smile "2.3.1"]
[tigris "0.1.1"]
[clout "1.1.0"]
[hiccup "1.0.4"]
[org.mindrot/jbcrypt "0.3m"]
[ring-middleware-format "0.3.2"]
[clj-yaml "0.4.0"]
[org.yaml/snakeyaml "1.5"]
[com.ibm.icu/icu4j "52.1"]
[org.clojure/core.memoize "0.5.6"]
[org.clojure/tools.reader "0.7.10"]
[ring "1.2.0"]
[ring/ring-jetty-adapter "1.2.0"]
[org.eclipse.jetty/jetty-server "7.6.8.v20121106"]
[org.eclipse.jetty.orbit/javax.servlet "2.5.0.v201103041518"]
[org.eclipse.jetty/jetty-continuation "7.6.8.v20121106"]
[org.eclipse.jetty/jetty-http "7.6.8.v20121106"]
[org.eclipse.jetty/jetty-io "7.6.8.v20121106"]
[org.eclipse.jetty/jetty-util "7.6.8.v20121106"]
[ring/ring-servlet "1.2.0"]
[javax.servlet/servlet-api "2.5"]
[markdown-clj "0.9.41"]
[org.clojure/clojurescript "0.0-2127"]
[com.google.javascript/closure-compiler "v20131014"]
[args4j "2.0.16"]
[com.google.code.findbugs/jsr305 "1.3.9"]
[com.google.guava/guava "15.0"]
[com.google.protobuf/protobuf-java "2.4.1"]
[org.json/json "20090211"]
[org.clojure/data.json "0.2.3"]
[org.clojure/google-closure-library "0.0-20130212-95c19e7f0f5f"]
[org.clojure/google-closure-library-third-party "0.0-20130212-95c19e7f0f5f"]
[org.mozilla/rhino "1.7R4"]
[org.clojure/clojure "1.5.1"]
[org.clojure/core.cache "0.6.3"]
[org.clojure/data.priority-map "0.0.2"]
[org.clojure/tools.nrepl "0.2.3" :exclusions [[org.clojure/clojure]]]
[ring-mock "0.1.5"]
[ring/ring-codec "1.0.0"]
[ring-server "0.3.1"]
[ring-refresh "0.1.2"]
[watchtower "0.1.1"]
[ring/ring-devel "1.2.1"]
[clj-stacktrace "0.2.5"]
[ns-tracker "0.2.1"]
[org.clojure/java.classpath "0.2.0"]
[org.clojure/tools.namespace "0.1.3"]
[selmer "0.6.1"]
[commons-codec "1.9"]
[joda-time "2.3"]

Queries with compound conditions

For example, suppose there is a table A with attribute B. I would like to collect all items such that (A.B = 0 OR A.B = 2). From what I see in the library, this doesn't seem to be implemented yet and would require one to execute a more general query and filter the results manually. How difficult would it be to implement compound conditional operators :or and :and and the rest of the condition expression syntax mentioned here?

If you could give me a quick idea of which parts of the API have to be changed, I can submit a PR a bit faster. Thanks.
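Until compound conditions are supported, the workaround this issue describes (run a broader scan and filter client-side) might look like the following sketch, using hypothetical table and attribute names:

```clojure
;; Hypothetical table :a with attribute :b; fetch broadly, filter locally.
;; Note this consumes read capacity for every scanned item, not just matches.
(->> (far/scan client-opts :a)
     (filter #(contains? #{0 2} (:b %))))
```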

Unprocessed items in wrong format

The unprocessed items returned after a throttled batch-write appear to be in a mixed format:

{:unprocessed #<HashMap {
  mytable=
    [{PutRequest: 
      {Item: {id={S: "808b2e40e",}, 
              uuid={S: d1fe6a7e-c80d-41dc-a14d-afe848fe9b71,}}},
  }
  ...
}

If I try to send the value of :unprocessed to batch-write again, it fails with a not-very-illuminating ClassCastException:

NullPointerException   clojure.lang.Reflector.invokeNoArgInstanceMember (Reflector.java:296)
java.lang.ClassCastException: null
 at 

Implementation discussion: update-table changes

I'm looking into expanding update-table to add other functionality, like GlobalSecondaryIndexUpdates, since as discussed on #77, it only currently supports provisioned throughput for the table.

I see the following issues:

  • Currently it expects a throughput map as a parameter, when you don't necessarily want to change the throughput for the table.
  • The map is specifically for throughput, when we should be receiving a map of values to change, for example, a description of :gsindexes updates to do.

I'm inclined to change that third parameter to update-table to just be a map with the settings for the operations to apply. This would be consistent with how the table description is passed to create-table and would solve both problems above, but since that'd be a breaking change I'd rather bring it up first.

Thoughts?

batch-write-item failing for set with > 1 element

Using faraday 1.4.0 against dynamodb-local.

(create-table client-opts :mytable [:id :n]
    {:throughput {:read 1 :write 1} :block? true})

(batch-write-item client-opts
    {:mytable {:put [{:id 1 :s #{"a" "b"}}]}})

Results in

AmazonServiceException Cannot perform multiple operations on the same item in a BatchWriteItem (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 93d9cf35-a1a4-4ad1-bceb-bc0a192f17e8)  com.amazonaws.http.AmazonHttpClient.handleErrorResponse (AmazonHttpClient.java:805)

Whereas this works fine:

(far/batch-write-item client-opts
    {:mytable {:put [{:id 1 :s #{"a"}}]}})

And put-item requests also work for the example that fails with batch-write-item.

Finally, a Java implementation of the problematic BatchWriteItem request works fine (knocked up just to make sure it's not an obvious dynamodb-local problem).

Slim down AWS dependencies

Hello!

[com.amazonaws/aws-java-sdk "1.9.25" :exclusions [joda-time]]

Does Faraday need the full AWS SDK? It makes for a rather large uberjar. Presumably only the DDB, and maybe core jars are required? Amazon provide separate jars for the different services.

Question about calling freeze

(I'm not sure if this is a good place to ask. If not, sorry.)

This is a design-decision question: why does data that isn't a native DynamoDB type have to be converted manually by calling freeze, instead of this happening auto-magically? That way I wouldn't need to know about freeze at all and could always work in terms of Clojure types.

For example,

Why not:
(far/put-item client-opts :mttable {:id ["abc" "def"]}) ;; internally call freeze here
instead of
(far/put-item client-opts :mttable {:id (far/freeze ["abc" "def"])})

Also
(far/get-item client-opts :mttable {:id ["abc" "def"]}) ;; call freeze internally
than
(far/get-item client-opts :mttable {:id (far/freeze ["abc" "def"])})

Thanks!

Connect without explicit key/secret client-opts?

Is it possible to overload get/query etc. so that they use the IAM environment rather than explicit key/secret combinations?

I know the SQS api for example has overloads that don't require credentials and therefore defaults to IAM roles, so I'm wondering if the underlying Dynamo api and/or faraday are capable of doing this.

Currently I have to put the password directly into the shell script's environment in order to connect.

Thanks.
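For reference, the :client option shown in the README's client-opts example can be used for exactly this: a default-built client picks up credentials from the AWS default provider chain (environment variables, instance-profile IAM roles, etc.), so no explicit keys are needed. A sketch:

```clojure
(import '(com.amazonaws.services.dynamodbv2 AmazonDynamoDBClientBuilder))

;; defaultClient resolves credentials via the default provider chain,
;; so :access-key/:secret-key can be omitted on an IAM-role host.
(def client-opts {:client (AmazonDynamoDBClientBuilder/defaultClient)})

(far/list-tables client-opts)
```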

Terrible Write Performance with DynamoDB Local

It's taking about 150 ms to put a single item into a table using the samples from intro.md. I'm simply inserting a single string value.

The docs from amazon say that throughput is ignored for DynamoDB Local.

Is there anything I can do to speed this up? At this point, MongoDB is about 100x faster to insert one item. My data is rather large and will take months or years to import at this rate :).

Support for regions

Working in Europe means we need to keep all our data in the Amazon EU zones. A quick grep of the code shows faraday to be region-agnostic, and thus it assumes all data is held in (presumably) one of the US zones.

In the AWS API the region can be set on the AmazonDynamoDBClient as follows (using Clojure syntax):

(.setRegion (com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.)
            (com.amazonaws.regions.Region/getRegion com.amazonaws.regions.Regions/EU_WEST_1))

Might have a look at this myself if I get time - for now leaving it as a feature request :-)

Should update-table still return a promise?

Given that:

  • We can only do one update at a time
  • We no longer need to do multiple updates for stepping after 1640651

Do we still need update-table to return a promise? Doesn't it make more sense to make it a blocking call, since we don't want to execute multiple updates while one is already running?

Although (thinking out loud) if the update operation takes a while, we wouldn't want to block the thread for the duration...

:limit on scan throws ClassCastException

The following:

(far/scan creds :myTable {:limit 1})

throws a ClassCastException:

java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
 at taoensso.faraday$scan$run1__3040.invoke (faraday.clj:739)
    taoensso.faraday$scan.doInvoke (faraday.clj:746)
    clojure.lang.RestFn.invoke (RestFn.java:442)
    dynamodb_client.core$eval3205.invoke (form-init2217546699762230571.clj:1)
    clojure.lang.Compiler.eval (Compiler.java:6619)
    clojure.lang.Compiler.eval (Compiler.java:6582)
    clojure.core$eval.invoke (core.clj:2852)
    clojure.main$repl$read_eval_print__6588$fn__6591.invoke (main.clj:259)
    clojure.main$repl$read_eval_print__6588.invoke (main.clj:259)
    clojure.main$repl$fn__6597.invoke (main.clj:277)
    clojure.main$repl.doInvoke (main.clj:277)
    clojure.lang.RestFn.invoke (RestFn.java:1096)
    clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__591.invoke (interruptible_eval.clj:56)
    clojure.lang.AFn.applyToHelper (AFn.java:159)
    clojure.lang.AFn.applyTo (AFn.java:151)
    clojure.core$apply.invoke (core.clj:617)
    clojure.core$with_bindings_STAR_.doInvoke (core.clj:1788)
    clojure.lang.RestFn.invoke (RestFn.java:425)
    clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke (interruptible_eval.clj:41)
    clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__632$fn__635.invoke (interruptible_eval.clj:171)
    clojure.core$comp$fn__4154.invoke (core.clj:2330)
    clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__625.invoke (interruptible_eval.clj:138)
    clojure.lang.AFn.run (AFn.java:24)
    java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1145)
    java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:615)
    java.lang.Thread.run (Thread.java:724)

Issue with update-map being deprecated on update-item

There's a warning that update-map has been deprecated on update-item in favor of :cond-expr. Two issues:

  • Users can't avoid sending it, since it's one of the function's expected parameters
  • It seems like the message is wrong and what users are expected to send is :update-expr, not :cond-expr

Given that we're already doing breaking changes for 1.9, isn't this a good time to remove it altogether?

Release 1.6.0

In order to get 1.6.0 out the door some testing needs to be done.

I'm putting these together over at mixradio-forks/faraday, so if there are any comments or test suggestions create issues over there.

I've taken the step to include the local dynamo runner in the repo. This will allow contributors in the future to just clone, test and get on their way. Let me know if that breaks an assumption somewhere.

Feature: create/delete/update indexes on update-table

Expand update-table to support modifying indexes. While UpdateTableRequest has a list of global secondary index updates, if I attempt to add more than one I get:

Subscriber limit exceeded: Only 1 online index can be created or deleted simultaneously per table

So I'll likely restrict it to one operation at a time.

Proxy bug

Seems to be a bug when creating a dynamo client with specific proxy host and port

put-item :return crashing

(far/put-item client-opts "foo" {:uri "yyy" :_id "42"} {:return :all-new})

AmazonServiceException Return values set to invalid value

Am I missing something simple or is this a bug?
Thanks.

Query docs don't specify <values>

The query docs specify prim-key-conds, but don't say what <values> should be. After looking at the sources, I concluded that it must be a collection, so I used a vector and that worked.

Release 1.9 alpha 3

Hey Peter,

I think I'm done with the major changes for now - this would be a good point at which to send out an alpha 3 to Clojars.

Cheers!

ClassNotFoundException requiring faraday

Source:

(ns twit.core
  (:require [taoensso.faraday :as dynamo])
  (:gen-class))

(defn -main
  "Test"
  [& args]
  (println "Sample"))

Dependencies:

[[org.clojure/clojure "1.5.1"]
 [com.cemerick/bandalore "0.0.4"]
 [com.taoensso/faraday "1.0.0-RC3"]
 [twitter-api "0.7.4"]]

Environment:

Mac OS 10.9
Leiningen 2.3.3 on Java 1.7.0_45 Java HotSpot(TM) 64-Bit Server VM

Full Stack Trace:

Exception in thread "main" java.lang.ClassNotFoundException: com.amazonaws.services.dynamodbv2.model.AttributeDefinition
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at clojure.lang.DynamicClassLoader.findClass(DynamicClassLoader.java:61)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:190)
    at taoensso.faraday$eval26$loading__4910__auto____27.invoke(faraday.clj:1)
    at taoensso.faraday$eval26.invoke(faraday.clj:1)
    at clojure.lang.Compiler.eval(Compiler.java:6619)
    at clojure.lang.Compiler.eval(Compiler.java:6608)
    at clojure.lang.Compiler.load(Compiler.java:7064)
    at clojure.lang.RT.loadResourceScript(RT.java:370)
    at clojure.lang.RT.loadResourceScript(RT.java:361)
    at clojure.lang.RT.load(RT.java:440)
    at clojure.lang.RT.load(RT.java:411)
    at clojure.core$load$fn__5018.invoke(core.clj:5530)
    at clojure.core$load.doInvoke(core.clj:5529)
    at clojure.lang.RestFn.invoke(RestFn.java:408)
    at clojure.core$load_one.invoke(core.clj:5336)
    at clojure.core$load_lib$fn__4967.invoke(core.clj:5375)
    at clojure.core$load_lib.doInvoke(core.clj:5374)
    at clojure.lang.RestFn.applyTo(RestFn.java:142)
    at clojure.core$apply.invoke(core.clj:619)
    at clojure.core$load_libs.doInvoke(core.clj:5413)
    at clojure.lang.RestFn.applyTo(RestFn.java:137)
    at clojure.core$apply.invoke(core.clj:619)
    at clojure.core$require.doInvoke(core.clj:5496)
    at clojure.lang.RestFn.invoke(RestFn.java:408)
    at twit.core$eval20$loading__4910__auto____21.invoke(core.clj:1)
    at twit.core$eval20.invoke(core.clj:1)
    at clojure.lang.Compiler.eval(Compiler.java:6619)
    at clojure.lang.Compiler.eval(Compiler.java:6608)
    at clojure.lang.Compiler.load(Compiler.java:7064)
    at clojure.lang.RT.loadResourceScript(RT.java:370)
    at clojure.lang.RT.loadResourceScript(RT.java:361)
    at clojure.lang.RT.load(RT.java:440)
    at clojure.lang.RT.load(RT.java:411)
    at clojure.core$load$fn__5018.invoke(core.clj:5530)
    at clojure.core$load.doInvoke(core.clj:5529)
    at clojure.lang.RestFn.invoke(RestFn.java:408)
    at clojure.core$load_one.invoke(core.clj:5336)
    at clojure.core$load_lib$fn__4967.invoke(core.clj:5375)
    at clojure.core$load_lib.doInvoke(core.clj:5374)
    at clojure.lang.RestFn.applyTo(RestFn.java:142)
    at clojure.core$apply.invoke(core.clj:619)
    at clojure.core$load_libs.doInvoke(core.clj:5413)
    at clojure.lang.RestFn.applyTo(RestFn.java:137)
    at clojure.core$apply.invoke(core.clj:619)
    at clojure.core$require.doInvoke(core.clj:5496)
    at clojure.lang.RestFn.invoke(RestFn.java:408)
    at user$eval5$fn__7.invoke(form-init8013287035828196233.clj:1)
    at user$eval5.invoke(form-init8013287035828196233.clj:1)
    at clojure.lang.Compiler.eval(Compiler.java:6619)
    at clojure.lang.Compiler.eval(Compiler.java:6609)
    at clojure.lang.Compiler.load(Compiler.java:7064)
    at clojure.lang.Compiler.loadFile(Compiler.java:7020)
    at clojure.main$load_script.invoke(main.clj:294)
    at clojure.main$init_opt.invoke(main.clj:299)
    at clojure.main$initialize.invoke(main.clj:327)
    at clojure.main$null_opt.invoke(main.clj:362)
    at clojure.main$main.doInvoke(main.clj:440)
    at clojure.lang.RestFn.invoke(RestFn.java:421)
    at clojure.lang.Var.invoke(Var.java:419)
    at clojure.lang.AFn.applyToHelper(AFn.java:163)
    at clojure.lang.Var.applyTo(Var.java:532)
    at clojure.main.main(main.java:37)

batch-write-item product behaviour causing problems inserting items with lists

If we have top level lists in our items to insert e.g

{:put [{:id 0 :foo [1 2]}, {:id 1}]}

it seems the request for item with :id 0 will be duplicated as

[{:id 0, :foo [1]}, {:id 0, :foo [2]}, ...] 

I believe you take a product when list values are involved to support deletes as documented in the doc string for batch-write-item:

{:users {:put    [{:user-id 1 :username \"sally\"}
                        {:user-id 2 :username \"jane\"}]
             :delete [{:user-id [3 4 5]}]}})

Obviously this causes problems when the intention is to put a list into dynamo.

Hopefully that is clear, comment if it isn't!

Regards

Dan

Support for BigDecimal

Right now, the suggested way to persist BigDecimal values with faraday is to use Nippy serialization. This of course clashes with any need to access the data from outside the faraday library.

One quick fix that I came up with is altering taoensso.faraday/simple-num? to return true for BigDecimals as well:

(defn- simple-num? [x] (or (instance? Long       x)
                           (instance? Double     x)
                           (instance? Integer    x)
                           (instance? Float      x)
                           (instance? BigDecimal x)))

As BigDecimal is supported by DynamoDB's Java driver, the only thing we might lose with this approach is BigDecimal's precision or rounding mode, but I suppose you lose that with PostgreSQL, for example, as well.

Handling update-table exceptions

If I'm reading this correctly... if we have an exception on update-table, then the promise will never be delivered, and any calls deref'ing it will hang. Is that correct? Is there a better way to handle this, for instance, delivering nil?

I'm new to futures and promises, and there doesn't seem to be a way to cancel an un-delivered promise like there is to cancel a future.

put-item :expected not working as documented

(far/create-table client-opts "test" [:id :s])

(far/query client-opts "test" {:id [:eq "x"]})

[] ;; no records exist

(far/put-item client-opts "test" {:id "x"} {:expected {:id false}})

ConditionalCheckFailedException

I assumed the first put should succeed, and the second should throw. The docs say to provide {:attr false} to ensure the item doesn't already exist, but this results in it always failing, even the first time.

Thanks.

How to do paging with scan-parallel?

I'm trying to do paging with scan-parallel using :limit, but I'm not sure how to specify :last-prim-kvs in subsequent calls. Each segment needs its own last key, I presume.

Am I missing something, or is paging not implemented for parallel scan?

Automatic de/serialization does not work for boolean value false

I haven't looked at internals, but should be a trivial fix, I guess. Thx

my-app> (far/put-item client-opts :my-table {:id 4 :true (far/freeze true)})
nil
my-app> (far/put-item client-opts :my-table {:id 5 :false (far/freeze false)})
nil
my-app> (far/get-item client-opts :my-table {:id 4})
{:id 4N, :true true}
my-app> (far/get-item client-opts :my-table {:id 5})
{:id 5N, :false nil}
my-app>

Add convenient support for new DynamoDB lib features

Transaction library (July 2013): http://aws.typepad.com/aws/2013/07/dynamodb-transaction-library.html
Local testing library (Sep 2013): http://aws.typepad.com/aws/2013/09/dynamodb-local-for-desktop-development.html
Geospatial library (Sep 2013): http://aws.typepad.com/aws/2013/09/new-geo-library-for-dynamodb-.html

Anything else?

Will be swamped with work for the next few months, so any assistance with these would be very welcome. All libs expose a simple(ish) Java API; would just need a little wrapping to make them more convenient to use from Clojure/Faraday.

:return :count doesn't work

(far/query ddb t {:foo foo}
                          {:index "my-index"
                           :return :count})

Returns [] rather than a count. It appears merge-more is to blame.

Automatic IDs?

Is there a way to auto-generate the next unused ID for a primary key?
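DynamoDB itself has no server-side auto-increment. Two common patterns, sketched here with hypothetical table and attribute names (the update-item option names follow the current API docs):

```clojure
;; 1. Client-generated UUIDs (requires a string-typed :s primary key):
(far/put-item client-opts :users
  {:id (str (java.util.UUID/randomUUID)) :name "Steve"})

;; 2. An atomic counter kept in a separate (hypothetical) :counters table,
;;    using DynamoDB's ADD update action to increment and read back:
(far/update-item client-opts :counters
  {:counter-name "users-id"}
  {:update-expr     "ADD #v :one"
   :expr-attr-names {"#v" "val"}
   :expr-attr-vals  {":one" 1}
   :return          :all-new})
```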

Does update-table only support provisioned throughput?

I'd like to confirm - does update-table only support provisioned throughput? I can provide a pull request to add other functionality as well, such as updating secondary indexes, but wanted to confirm first that it's not already supported elsewhere (I'm new to faraday).

Boolean not supported?

I'm getting a "java.lang.Boolean not supported" error when I try to put-item with a boolean value. It seems that DynamoDB supports Boolean as a data type, though. Am I missing something, or is this yet to be implemented?

If it's the latter, could you point me in the right direction please, then I can submit a patch for it.

Support for asynchronous Java client

I have a need for a Clojure interface to Amazon's asynchronous Dynamo client, and Faraday is written in such a way that it would be relatively easy to support this.

I've forked and started factoring all of the request-construction logic into separate functions, and I plan to define a protocol containing only the top-level I/O and to provide asynchronous implementations that share more or less everything with the existing synchronous ones.

I'm curious about how likely you'd be to accept a PR, assuming API compatibility is entirely maintained, or whether this falls outside of the scope of what you're trying to do with the library.

Adding faraday dependency causes very slow lein builds

When adding faraday to a lein project, I've found that lein becomes very slow to resolve dependencies. With faraday present, lein deps :tree takes around 5 minutes for me; without faraday, the same call returns instantly.

I tracked this down to the following transitive dependency:

[com.taoensso/faraday "1.1.1"] 
    -> [com.amazonaws/aws-java-sdk "1.7.1"]
        -> [joda-time "[2.2,)"]

It appears that lein/aether handles dependency ranges very badly. I found this post that sheds some light on things:

http://nelsonmorris.net/2012/07/31/do-not-use-version-ranges-in-project-clj.html

I've fixed this using an exclusion for joda-time on faraday (I use clj-time in my project and this has an explicit dependency on joda-time anyway):

[com.taoensso/faraday "1.1.1" :exclusions [joda-time]]

but other people here have hit the same problem as soon as they start using faraday.

This isn't a problem with this library, but it certainly feels like one when new users add this dependency to their project. I wonder if it would be worthwhile overriding this range with a specific joda-time version?
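For reference, a minimal project.clj sketch of the workaround: exclude the ranged joda-time and pin a concrete version instead (version numbers here are illustrative).

```clojure
;; project.clj (excerpt)
:dependencies [[com.taoensso/faraday "1.1.1" :exclusions [joda-time]]
               [joda-time "2.3"]] ; pin a concrete version instead of [2.2,)
```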

update-table does nothing and returns no error message

I'm trying to use update-table, but it doesn't seem to do anything, nor does it return any error message. Am I doing something wrong, or is there a defect here?

Here is the output from the REPL session:

dynamodb-client.core> (far/describe-table creds "myTableName")
{:name :myTableName,
 :creation-date #inst "2013-09-27T15:28:35.000-00:00",
 :item-count 0,
 :size 0,
 :throughput
 {:read 30,
  :write 30,
  :last-decrease nil,
  :last-increase nil,
  :num-decreases-today 0},
 :indexes nil,
 :status :active,
 :prim-keys {:id {:data-type :s, :key-type :hash}}}
nil
dynamodb-client.core> (def update-promise (far/update-table creds "myTableName" [200 100]))
#<Var@3814a36e: #<Promise@1e1127f1: nil>>
nil

dynamodb-client.core> (realized? update-promise)
true
nil

dynamodb-client.core> @update-promise
nil
nil

dynamodb-client.core> (far/describe-table creds "myTableName")
{:name :myTableName,
 :creation-date #inst "2013-09-27T15:28:35.000-00:00",
 :item-count 0,
 :size 0,
 :throughput
 {:read 30,
  :write 30,
  :last-decrease nil,
  :last-increase nil,
  :num-decreases-today 0},
 :indexes nil,
 :status :active,
 :prim-keys {:id {:data-type :s, :key-type :hash}}}
nil
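One possible explanation (unverified): DynamoDB has historically rejected provisioned-throughput increases of more than 2x per call, so jumping straight from [30 30] to [200 100] could fail, and the error might be getting swallowed inside the promise. A sketch of stepping up gradually instead:

```clojure
;; Hypothetical: if increases are capped at 2x per update-table call,
;; step up in stages (deref each promise before the next call):
@(far/update-table creds "myTableName" [60 60])    ; 30  -> 60
@(far/update-table creds "myTableName" [120 100])  ; 60  -> 120
@(far/update-table creds "myTableName" [200 100])  ; 120 -> 200
```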

:limit not working with query function

I'm trying to paginate results but my query doesn't seem to limit the result set as expected. Just to be clear I'm querying a table with a composite key of :id and :timestamp. The table has 25 items with the same :id.

(far/query client-opts table {:id [:eq id]} {:limit 10})                                            

Another thing: how do I retrieve the LastEvaluatedKey from my results?
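If I understand the implementation, :limit only caps items per underlying request, and Faraday then auto-fetches subsequent pages, which would explain getting all 25 items. Capping :span-reqs should give true one-page pagination, and the last evaluated key appears to be attached as result metadata. A sketch (option names as I understand them; may be wrong):

```clojure
;; One page of at most 10 items, no automatic follow-up requests:
(let [page (far/query client-opts table {:id [:eq id]}
                      {:limit 10 :span-reqs {:max 1}})]
  ;; LastEvaluatedKey is (apparently) exposed as metadata:
  {:items    page
   :last-key (:last-prim-kvs (meta page))})
```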
