Comments (7)
Alright, it can be done.
Example usage:

defmodule Sandbox do
  defmodule Client do
    use Tesla
  end

  defmodule TuplesClient do
    use Tesla
    plug Tesla.Middleware.Tuples
  end

  def main do
    case Client.get("/") do
      %{status: _status} -> :ok
    end
  end

  def main_tuple do
    case TuplesClient.get("/") do
      {:ok, _env} -> :ok
      {:error, _reason} -> :error
    end
  end
end
On current master:
⌘ ~/code/tesla/sandbox λ mix dialyzer
...
lib/sandbox.ex:18: Function main_tuple/0 has no local return
lib/sandbox.ex:20: The pattern {'ok', _env@1} can never match the type #{'__client__':=fun(), '__module__':=atom(), '__struct__':='Elixir.Tesla.Env', 'body':=_, 'headers':=#{binary()=>binary()}, 'method':='delete' | 'get' | 'head' | 'options' | 'patch' | 'post' | 'put' | 'trace', 'opts':=[any()], 'query':=[{atom() | binary(),binary() | [{atom() | binary(),binary() | [{_,_}]}]}], 'status':=integer(), 'url':=binary()}
lib/sandbox.ex:21: The pattern {'error', _reason@1} can never match the type #{'__client__':=fun(), '__module__':=atom(), '__struct__':='Elixir.Tesla.Env', 'body':=_, 'headers':=#{binary()=>binary()}, 'method':='delete' | 'get' | 'head' | 'options' | 'patch' | 'post' | 'put' | 'trace', 'opts':=[any()], 'query':=[{atom() | binary(),binary() | [{atom() | binary(),binary() | [{_,_}]}]}], 'status':=integer(), 'url':=binary()}
done (warnings were emitted)
On branch dependant-types:
⌘ ~/code/tesla/sandbox λ mix dialyzer
...
done (passed successfully)
Required change to middleware
Explanation
I must admit this is quite a hack 💃

First, I tried @type return :: ... in Tesla.Middleware.Tuples, but I couldn't find an easy way to get to that information after the middleware module is compiled. AFAIK, dialyzer does some magic reading from beam files to support referencing types from other modules.

Then I tried macros, but I stumbled upon unexplainable errors from the Elixir compiler while compiling the Tesla.Builder module.

Fortunately, since quoted code is just a data structure, it can be used not only in macros but also in regular functions. Hence the def return_type with quote inside Tesla.Middleware.Tuples.
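The trick above can be sketched in a few lines (TuplesSketch is an invented name for illustration, not the actual Tesla module): `quote` just builds an AST data structure, so it works inside a plain function with no macro machinery at all.

```elixir
# A minimal sketch of the idea: `quote` returns ordinary data (an AST),
# so a regular function can hand back a quoted typespec.
defmodule TuplesSketch do
  # Return the quoted type `{:ok, Tesla.Env.t} | {:error, any}` as AST.
  # Referencing Tesla.Env.t inside `quote` needs no compiled Tesla module.
  def return_type do
    quote do
      {:ok, Tesla.Env.t} | {:error, any}
    end
  end
end

# The caller gets plain data it can later splice into a @type definition
# with unquote/1 at compile time.
IO.puts(Macro.to_string(TuplesSketch.return_type()))
```

Because the result is just data, the builder can call this function at compile time and inject the AST wherever a type expression is expected.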
From this point the rest is fairly straightforward. First, I replaced :: Tesla.Env.t with a type alias :: return_type in Tesla.Builder. Then, in __before_compile__, I check the last middleware in the stack for a return_type/0 function. Note that I had to use Code.ensure_loaded/1 to make function_exported?/3 work correctly. If there is such a function, I simply call it, get the quoted type spec, and inject it as @type return_type :: [here]. If there is no middleware, or the last one does not export a custom return type, there is a fallback to Tesla.Env.t.
Here is the code for __before_compile__:
https://github.com/teamon/tesla/blob/8e8fd81e212530bb7cb73fc851faac9b54bc1621/lib/tesla.ex#L345-L370
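Schematically, the hook works something like this. This is a hypothetical, simplified reconstruction, not the linked implementation: the module names are invented and map() stands in for Tesla.Env.t, but the Code.ensure_loaded/1 + function_exported?/3 check and the unquote injection follow the steps described above.

```elixir
# Invented middleware exposing its return type as quoted AST.
defmodule TuplesMiddlewareSketch do
  def return_type, do: quote(do: {:ok, map()} | {:error, any()})
end

defmodule BuilderSketch do
  defmacro __before_compile__(_env) do
    # Stand-in for "the last middleware in the stack".
    middleware = TuplesMiddlewareSketch

    # Code.ensure_loaded/1 is required: function_exported?/3 can return
    # false for a module that has not been loaded yet.
    type =
      with {:module, mod} <- Code.ensure_loaded(middleware),
           true <- function_exported?(mod, :return_type, 0) do
        mod.return_type()
      else
        # Fallback when no custom return type is exported
        # (Tesla.Env.t in the real code, map() here).
        _ -> quote(do: map())
      end

    quote do
      @type return_type :: unquote(type)
      # Expose the injected type as a string so the effect is observable.
      def injected_type, do: unquote(Macro.to_string(type))
    end
  end
end

defmodule ClientSketch do
  @before_compile BuilderSketch
end

IO.puts(ClientSketch.injected_type())
```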
And here is the full diff of changes:
So... what do you think? :)
from tesla.
The return type spec for get/post/... is Tesla.Env.t. When using the Tuples middleware this is changed to {:ok, Tesla.Env.t} | {:error, any}, but this is not reflected in the typespec, hence the dialyzer error.

I can think of some way to check for the Tuples middleware during compilation and adjust the generated typespecs, but that would require figuring out a way to make this a general feature (i.e. one could make a custom middleware with custom return values that would need to be reflected in the generated typespec).
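Schematically, the mismatch looks like this (invented module, with map() standing in for Tesla.Env.t): the spec promises a bare env, while the middleware-wrapped function actually returns a tuple, which is exactly what dialyzer flags.

```elixir
defmodule SpecMismatchSketch do
  # The generated spec promises a bare result map...
  @spec get(String.t()) :: map()
  def get(_url) do
    # ...but a Tuples-style middleware wraps the result in an :ok tuple,
    # so callers pattern-matching on {:ok, _} contradict the spec and
    # dialyzer reports "can never match" on those clauses.
    {:ok, %{status: 200}}
  end
end
```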
I have a workaround: use Tesla, docs: false. I had no knowledge of dialyzer when I wrote the :only/:except/:docs options... docs: false unintentionally disables typespecs as well, so you can write your own inside your module.
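Applied to the sandbox above, the workaround might look like this (a sketch assuming the tesla dependency; MyClient and fetch/1 are illustrative names, and the hand-written spec matches what the Tuples middleware actually returns):

```elixir
defmodule MyClient do
  # docs: false also suppresses the generated typespecs, so the
  # hand-written spec below is the only one dialyzer sees.
  use Tesla, docs: false
  plug Tesla.Middleware.Tuples

  # Spec written to match the Tuples middleware's tuple-wrapped return.
  @spec fetch(String.t()) :: {:ok, Tesla.Env.t()} | {:error, any()}
  def fetch(url), do: get(url)
end
```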
from tesla.
use Tesla, docs: false worked like a charm! Thank you!
I would like to keep this issue open for a few days, hoping to get more feedback on this. @teamon , is it okay?
Definitely, it should stay open. I should have opened this myself ~ two weeks ago 😞
@ahmadferdous have you got a chance to try this out?
@ahmadferdous I'm going to close this one for now. I think the additional complexity outweighs the gains in this case. Feel free to reopen/comment.