
daemonmode.jl's People

Contributors

bkaperick, dmolina, feanor12, giordano, gsoleilhac, moelf, pallharaldsson, ranjanan, shuenhoy


daemonmode.jl's Issues

include can cause problems

I tried to include a local module in a script I ran via DaemonMode and I had some problems.

It seems in the script I have to use something like this to get it to work.

# instead of
include("../lib.jl")
# I had to use
Base.include(@__MODULE__,"../lib.jl")

The reason is that DaemonMode uses a module to encapsulate the code, but no local include is defined for that module. To include external code, one has to include it into the temporary module via Base.include.

Maybe the missing include is also a bug in julia.
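
One possible workaround, as a sketch based on the explanation above (the script runs inside a wrapper module that lacks its own include), is to define one at the top of the script. Note that this would clash with Main's own include if the same script were run with plain julia, so it is only meant for the DaemonMode case:

# Give the wrapper module a local include that forwards to Base.include.
include(path::AbstractString) = Base.include(@__MODULE__, path)

include("../lib.jl")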

New Version 0.1.1

@JuliaRegistrator register

Release notes:

  • Add parameter print_stack to serve, which lets the server report the complete stack trace when there is an error.

  • By default, the environment is shared when running expressions, but not when running files. This can be changed with a parameter of serve.

  • Better behaviour when the client closes the communication.

  • Fix error with include in the script.

"DaemonMode::end" emitted when built into a sysimage

I just tried using DaemonMode with a sysimage; however, I find that DaemonMode::end is emitted and the client process doesn't exit.

Sysimage construction

~$ julia --startup-file=no -e 'using PackageCompiler; PackageCompiler.create_sysimage([:DaemonMode]; sysimage_path="/home/tec/.local/lib/julia_daemon.so", project="/home/tec/.julia/config/sysimages/daemon_project", incremental=false, filter_stdlibs=true)'

daemon_project is just a project directory with just DaemonMode added.

Trying the client

~$ /home/tec/.julia/juliaup/bin/julia --sysimage=/home/tec/.local/lib/julia_daemon.so --startup-file=no -e "using DaemonMode; runargs()" /tmp/test.jl
Hello world
DaemonMode::end

At this point the process just sits there; if I press ^C, I then see: Error, cannot connect with server. Is it running?

eval() does not work with DaemonMode?

I was very happy to discover the DaemonMode package!
One problem I just encountered is that it fails on eval().
Is this a user error? Can I somehow make it work?
Or is it a bug?
See below for a simple example:

$ cat test.jl
println("test the daemon!")
v = eval(Meta.parse("pi"))
println("test the daemon! v= ", v)
$ julia -e 'using DaemonMode; runargs()' test.jl
test the daemon!
ERROR: LoadError: MethodError: no method matching eval(::Symbol)
Stacktrace:
 [1] top-level scope at /private/tmp/test.jl:2

$ julia test.jl 
test the daemon!
test the daemon! v= π
$ 
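
The error suggests that, as with include, the wrapper module used by DaemonMode has no one-argument eval defined. A hedged workaround is to name the module explicitly:

# test.jl, adapted as a sketch: evaluate in the current (wrapper) module explicitly.
println("test the daemon!")
v = Core.eval(@__MODULE__, Meta.parse("pi"))
println("test the daemon! v= ", v)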

Versions:

julia> versioninfo()
Julia Version 1.5.3
Commit 788b2c77c1 (2020-11-09 13:37 UTC)
Platform Info:
  OS: macOS (x86_64-apple-darwin18.7.0)
  CPU: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-9.0.1 (ORCJIT, skylake)
julia>

Different output from DaemonMode when compared to native julia

I tried to run a file containing the following

println("Hello World")

import Pkg

Pkg.status()

The output for DaemonMode (async=false; using runargs) looks like this:

      StatusHello World
 `/..../Project.toml`
  [d749ddd5] DaemonMode v0.1.10

and the output from julia looks like this

Hello World
      Status `/..../Project.toml`
  [d749ddd5] DaemonMode v0.1.10

It might be a race condition. I am using julia 1.7-rc3
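
A possible workaround until the ordering is fixed, assuming the interleaving comes from Pkg writing to a different stream than the redirected one, is to route Pkg.status through a buffer via its io keyword and print the result yourself:

println("Hello World")

import Pkg

io = IOBuffer()
Pkg.status(io = io)            # collect the status report instead of letting Pkg print it
print(String(take!(io)))       # emit it through the same stdout as println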

Version v0.1.5

@JuliaRegistrator register

New version v0.1.5 with parallelism (multi-tasking)

After a busy weekend, I am writing to let you know about a new version, v0.1.5, with multi-task running of clients.

In previous versions, if you ran a program that took some time and tried to run another one before the first had finished, the second one did not start until the first one was done.

In version v0.1.5, all programs are run as tasks in parallel, so the second one can be started before the first one is finished.

Because I consider the new behaviour a lot better than the previous one, I have made async mode active by default. However, you can run the serve function with the parameter async=false to get the previous behaviour.

# Async mode
$  julia -e 'using DaemonMode; serve(async=true)'
# Sync mode (previous behaviour)
$  julia -e 'using DaemonMode; serve(async=false)'

This has several advantages:

  • You can run a new client without waiting for the previous one to finish.

  • If one process asks to close the daemon, it will wait until all clients have finished.

  • Standard output (and standard error) is always shown for the corresponding program.

Disadvantages:

  • If several clients are running at the same time, @info output is shown by the last one. Because @info is usually only used for debugging, I think this is not a big problem.
  • Also, log messages can equally be sent to the last one, so it is better to redirect log messages to files.

The main problem during development was that redirect_stdout and redirect_stderr did not work well with tasks: they change a global variable, so they are not suited to a multi-tasking environment. Thus, the output (normal and error) was always sent to the last program to be run, not to the program responsible for it. To fix that, I have defined print and the stderr handling to redirect that information manually to the socket (using async code so the output can be redirected in real time). That approach cannot be replicated for macros, which is the reason for the disadvantages above.
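
As an illustration only (this is not the package's actual code), the idea can be sketched roughly like this: each client's code is evaluated in a module whose print and println write straight to that client's socket, instead of relying on the global redirect_stdout.

# Rough sketch, not DaemonMode's implementation: per-client output without redirect_stdout.
function client_module(sock::IO)
    m = Module(:ClientSandbox)
    Core.eval(m, quote
        print(xs...)   = Base.print($sock, xs...)
        println(xs...) = Base.println($sock, xs...)
    end)
    return m
end

# Each connection then evaluates its code in its own module:
# Core.eval(client_module(sock), Meta.parse(code))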

I hope you find it very useful.

Version 0.1.6

@JuliaRegistrator register

New version v0.1.6

In version v0.1.5 DaemonMode was able to run clients in parallel, but not on several CPUs. In this version it can run each client on a different CPU.

$  julia -e 'using DaemonMode; serve(async=true)'

That command will allow different clients to run in parallel, but it will use only one CPU.

If you want to use several threads, you can do:

$  julia -t auto -e 'using DaemonMode; serve(async=true)'

auto lets DaemonMode use all processors of the computer, but you can also pass -t 1, -t 2, ...

With several threads (indicated with -t), you can run several clients on different CPUs without increasing the time for each client. With only one thread, the processing time is divided between the different clients.

Changes

  • Improve documentation in relation with async and threads.

Fixes

Is it possible to build this daemon mode into official Julia?

The time-to-first-plot issue makes people treat Julia as a slow language compared to MATLAB/R/Python. Unlike Julia, a Python script runs fast after its first run, whereas every Julia launch requires compiling everything again. If this daemon mode were the default behavior, it would be friendlier to new users.

Race conditions for printing when using async=true

The output to stdout sometimes produces a broken pipe error.
I also realized that redirect_stdout might not be a good idea in a concurrent environment, as it changes the global Base.stdout variable.

I created a post on Discourse asking how one would create a module-owned stdout. Maybe someone knows a good solution.
The one I found was writing to a temporary file instead of stdout by overwriting print and println; maybe show can also be overwritten.

https://discourse.julialang.org/t/how-to-catch-stdout-per-module/59611
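
For reference, the temporary-file idea mentioned above might look roughly like this inside a script (a sketch only; it relies on the script being evaluated in DaemonMode's wrapper module, where print/println can be shadowed):

# Capture this script's output in a scratch file instead of racing on the shared stdout.
logpath, logio = mktemp()

print(xs...)   = Base.print(logio, xs...)
println(xs...) = Base.println(logio, xs...)

println("goes to ", logpath, " instead of stdout")
flush(logio)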

MethodError runargs

I get this with version 0.1.10 and julia 1.8.2:

ERROR: MethodError: no method matching runargs(; port=16180)
Closest candidates are:
  runargs() at C:\Users\...\.julia\packages\DaemonMode\fAZD4\src\DaemonMode.jl:576 got unsupported keyword argument "port"
  runargs(::Any) at C:\Users\...\.julia\packages\DaemonMode\fAZD4\src\DaemonMode.jl:576 got unsupported keyword argument "port"
Stacktrace:
 [1] top-level scope
   @ none:1

Where is the problem?
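
Judging from the candidate methods in the error message, the port seems to be a positional argument of runargs rather than a keyword, so (as an untested guess, with script.jl standing in for your script) something like this should work:

$ julia -e "using DaemonMode; runargs(16180)" script.jl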

ERROR: LoadError: IOError: could not spawn setenv(...) operation not supported on socket (ENOTSUP)

While DaemonMode works at home on my private computer, I get this on my work computer in the office:

ERROR: LoadError: IOError: could not spawn setenv(... julia ... -Cnative ... -g1 -O0 --output-ji  ...) operation not supported on socket (ENOTSUP)
Stacktrace:
 [1] _spawn_primitive at .\process.jl:128
 [2] #725 at .\process.jl:139
 [3] setup_stdios at .\process.jl:223
 [4] _spawn at .\process.jl:138
 [5] _spawn at .\process.jl:166
 [6] #open#734 at .\process.jl:397
 [7] open at .\process.jl:366

Might this be related to antivirus software? I read something about this on the web.

The connection from the client to the DaemonMode server seems to be established.

EDIT:
With Julia 1.9 alpha the error vanishes.

exit() exits the server

Hello again,

I put exit(2) in a script, and noticed that running it with DaemonMode seemed to cause the server to exit, not the client. Since I feel scripts may reasonably use exit codes to signal different types of errors, this behaviour seems rather undesirable to me.

Remote execution via string

First, thank you for this excellent package!

I wonder if it's possible to support remote execution via ssh tunneling. I saw there is the ability to execute a string with runexpr() but it does a cd(dir) before executing the string. Maybe a simple change could provide a way for remote execution via ssh without too much trouble:
aminnj@eb84294
(Then one can bypass the cd(dir) by feeding cwd=".")

Remote machine

julia --startup-file=no -e "using DaemonMode; serve(3001)"

Local machine

# script to verify we're indeed executing on the remote machine
$ cat test.jl
println(run(`ls`))

# tunnel locally
ssh -N -f -L localhost:3001:localhost:3001 <remote-hostname>

# send the script as a string to be executed on the remote machine.
julia --startup-file=no -e 'using DaemonMode; runexpr(read("test.jl", String), cwd=".", port=3001)'

`@__FILE__` macro incorrectly expanded

Couldn't find any mention of it, so just logging it here: the @__FILE__ macro is incorrectly expanded as "string" in a script file run with DaemonMode (v0.1.9), unlike when it is run as a normal script.
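
A minimal repro, as I understand the report, would be something like:

# whereami.jl (hypothetical file name)
println(@__FILE__)   # prints the script path with plain julia, but "string" under DaemonMode v0.1.9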

No colored output

First of all, thanks for this incredible package. Certainly, it saves a lot of time from many Julia users.

I installed the package and it is working great. The only issue I have is that there is no coloured output. I have a lot of code that prints messages such as warnings and hints to the terminal.

Throw on failure to connect to server

I'm not quite sure, but wouldn't it be better to fail with an exception when not being able to connect to the server, instead of just logging:

println(stderr, "Error, cannot connect with server. Is it running?")

My current use case is a simple bash script which falls back to running Julia without DaemonMode if it is not installed or running. As it stands, I would need to parse the output on stderr instead of using the return value.

Version 0.1.2

@JuliaRegistrator register

0.1.2 - (2021-02-05)

  • Fix a problem when the client did not run a println, but only a print.

  • Fix the backtrace to use colors and to include the line of the file.

interruption with CTRL-C

Hi,
I use DaemonMode to run my unit tests.
If I interrupt the execution with CTRL-C, the client aborts but the server sometimes continues running the tests.
Is there any way to abort them?

I tried this, but I'm not even able to get into the "catch" to detect when the daemon execution should be aborted:

        Base.exit_on_sigint(false)
        try
            runargs()
        catch e
            println("crashed")
        end

thanks

Version 0.1.7

@JuliaRegistrator register

New version v0.1.7

New

  • Add option threaded to allow activating or deactivating the use of threads with async.
  • The function exit() can be used inside the client.

Changes

Fixes

  • Usage of exit() function inside the client.
  • Fix an error that depended on the order in which the output and/or the return code were sent to the socket.

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Check if server is running

It would be really nice to have a little bash script that effectively runs julia in daemon mode, starting a server if there's no server active. But is there a way to check from the command line if there's already a server going?
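
One way to probe for a running server from a script, sketched under the assumption that the daemon listens on DaemonMode's default port 3000, is simply to try opening a TCP connection (the server may log a complaint about the aborted connection):

# check_daemon.jl (hypothetical): exit 0 if something accepts connections on the port.
using Sockets

function daemon_running(port = 3000)
    try
        close(Sockets.connect(Sockets.localhost, port))
        return true
    catch
        return false
    end
end

exit(daemon_running() ? 0 : 1)

A wrapper script could then start the server in the background whenever this reports that nothing is listening.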

print causes hang

Running the example below through DaemonMode causes a hang, and the following output appears after a Ctrl+c. This does not happen if I use println instead of print.

# repro.jl
print("Hello world")
λ julia --startup-file=no --project=. -e "using DaemonMode; runargs()" repro.jl
hello worldDaemonMode::end
# <Ctrl + c>
Error, cannot connect with server. Is it running?

This is on Windows 10, Julia 1.5, DaemonMode 0.1.1

Version 0.1.1

@JuliaRegistrator register

Release notes:

  • Add parameter print_stack to serve, which lets the server report the complete stack trace when there is an error.

  • By default, the environment is shared when running expressions, but not when running files. This can be changed with a parameter of serve.

  • Better behaviour when the client closes the communication.

  • Fix error with include in the script.

Add SIGINT handler?

Thanks for the package, I'll need to read through more to understand it all.

One idea is that it could be convenient to handle SIGINT. Currently we have to press ^C twice to kill the server and it's a bit noisy:

%zsh: jl --startup-file=no -e 'using DaemonMode; serve()'
^CUnhandled Task ERROR: InterruptException:
Stacktrace:
 [1] poptask(W::Base.InvasiveLinkedListSynchronized{Task})
   @ Base ./task.jl:827
 [2] wait()
   @ Base ./task.jl:836
 [3] wait(c::Base.GenericCondition{ReentrantLock})
   @ Base ./condition.jl:123
 [4] #134
   @ /usr/share/julia/stdlib/v1.7/Distributed/src/remotecall.jl:281 [inlined]
 [5] lock(f::Distributed.var"#134#136", l::ReentrantLock)
   @ Base ./lock.jl:190
 [6] lock
   @ ./condition.jl:78 [inlined]
 [7] macro expansion
   @ /usr/share/julia/stdlib/v1.7/Distributed/src/remotecall.jl:279 [inlined]
 [8] (::Distributed.var"#133#135")()
   @ Distributed ./threadingconstructs.jl:178
^Cfatal: error thrown and no exception handler available.
InterruptException()
jl_mutex_unlock at /buildworker/worker/package_linux64/build/src/julia_locks.h:129 [inlined]
jl_task_get_next at /buildworker/worker/package_linux64/build/src/partr.c:484
poptask at ./task.jl:827
wait at ./task.jl:836
task_done_hook at ./task.jl:544
_jl_invoke at /buildworker/worker/package_linux64/build/src/gf.c:2247 [inlined]
jl_apply_generic at /buildworker/worker/package_linux64/build/src/gf.c:2429
jl_apply at /buildworker/worker/package_linux64/build/src/julia.h:1788 [inlined]
jl_finish_task at /buildworker/worker/package_linux64/build/src/task.c:218
start_task at /buildworker/worker/package_linux64/build/src/task.c:888

Print full stacktrace when script throws

Currently the client only receives a single line about the exception, and that's all the user gets to see. IMHO we could print a full stacktrace to the socket; I can change it together with #7, but this is going to change behavior more significantly. So do we want such a change?

Version 0.1.8

@JuliaRegistrator register
New version v0.1.8

0.1.8 - (2021-05-29)

New

  • Improve documentation with PackageCompiler.

Fixes

  • Fix annoying error message when the process closes during output.

Log output in server not in client

Thank you for this cool package. I'm using it in combination with dvc, which starts a new julia process for every pipeline step. DaemonMode really helps here and speeds everything up significantly!

One thing I noticed, however:
Any logs written with @info are output in the server process, which is a little confusing.

Version 0.1.9

@JuliaRegistrator register
New version v0.1.9

0.1.9 - (2021-07-27)


New

Automatically reload the modified packages

DaemonMode executes the code that is passed directly to the server, so each time the code is updated you get up-to-date results. However, sometimes you may also be developing some packages at the same time and want them to be reloaded when modified. You can use Revise together with DaemonMode for this purpose. You only need to add using Revise before starting the DaemonMode server:

julia --startup-file=no -e 'using Revise; using DaemonMode; serve()'

Poor timing on print

DaemonMode works quite well with respect to TTFX. Nevertheless, I get strange timing with println(), i.e. the order of print statements is not preserved. Even if I use flush(stdout), nothing changes.

Is there any workaround for this?

Version 0.1.2

@JuliaRegistrator register

0.1.2 - (2021-02-05)

  • Fix a problem when the client did not run a println, but only a print.

  • Fix the backtrace to use colors and to include the line of the file.

Handle `-e` / `-E` arguments.

By using -- in the shell alias, we can get flags to be included in ARGS

alias juliaclient='julia --startup-file=no -e "using DaemonMode; runargs()" --'

This way one could do juliaclient -e 'println("Hi")' etc.

I imagine DaemonMode's runargs() could be extended to recognise -e, -E, and --project perhaps?
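
As a sketch of what that could look like on the client side (a hypothetical wrapper, not part of DaemonMode's API, and ignoring the print-the-result semantics of -E for brevity), one could peel off a leading -e/-E and forward it through runexpr:

# Hypothetical client-side wrapper: treat a leading -e/-E as an expression, else fall back to runargs().
using DaemonMode

function runargs_with_flags(args = ARGS)
    if length(args) >= 2 && args[1] in ("-e", "-E")
        runexpr(args[2])
    else
        runargs()
    end
end

runargs_with_flags()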

Exit code from a script

Is it possible to return the correct exit code when using runfile()?

Maybe overwriting exit with a return would make it possible to exit correctly without killing the server:

exit(x) = return x
exit(4)

Server crashes when sent this command.

If I run this line the DaemonMode server crashes:
julia --project -e "using DaemonMode" -e 'runexpr("if true ; end\n")'

In the REPL the expression works when run like this:
julia> eval(Meta.parse("if true ; end\n"))

I think this may be fixed at the serverReplyError(e) call, by replacing it with a two-argument call.

Server output:

ERROR: MethodError: no method matching serverReplyError(::Base.Meta.ParseError)
Closest candidates are:
  serverReplyError(::Any, ::Any) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:57
  serverReplyError(::Any, ::Any, ::Any) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:71
Stacktrace:
 [1] serverRunExpr(::Sockets.TCPSocket, ::Bool, ::Bool) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:175
 [2] serve(::Int64, ::Missing; print_stack::Bool) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:43
 [3] serve at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:32 [inlined] (repeats 2 times)
 [4] top-level scope at none:1
caused by [exception 1]
Base.Meta.ParseError("extra token after end of expression")
Stacktrace:
 [1] parse(::String; raise::Bool, depwarn::Bool) at ./meta.jl:220
 [2] parse at ./meta.jl:215 [inlined]
 [3] serverRunExpr(::Sockets.TCPSocket, ::Bool, ::Bool) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:167
 [4] serve(::Int64, ::Missing; print_stack::Bool) at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:43
 [5] serve at /user/bla/.julia/packages/DaemonMode/lrn5P/src/DaemonMode.jl:32 [inlined] (repeats 2 times)
 [6] top-level scope at none:1

Passing paths as ARGS on Windows

The following script doesn't work on my computer

# script.jl
println(ARGS)
julia --project --startup-file=no -e "using DaemonMode; runargs()" .\script.jl "D:\w"
PAUSE

The error message is

ERROR: ERROR: MethodError: no method matching (::Colon)(::Int64, ::Nothing)

Whereas this is fine

julia --project .\script.jl "D:\w"
PAUSE

It seems that DaemonMode tries to process the ARGS and has problems with the Windows path delimiter (backslash)?

Processes don't always terminate

Hi. I have been happily using this application, but a bug has crept in (I don't know what I did differently).

Some of the time, if I send a job to the daemon using 'at' (the deferred-job scheduler in Linux), the jobs execute OK but don't terminate; they seem to continue indefinitely, creating bigger and bigger files (as in many tens of GB) in the spool.

I made sure to include quit() at the end of all of my scripts, but is there another way to make sure they exit?

Also, has anyone else had this problem?

Thanks for any help.

PS: After writing this, I found what seems to be the problem: the server my program was connecting to was hanging. When the server/connection is OK, the program terminates as it should, so there is no bug really (except perhaps for the filling up of my hard drive when this happens!).
I think it would be very helpful if it were possible to set a timeout on the daemon-executed script.

PPS: Since writing this, I have found a solution for Linux users: the 'timeout' utility
(see https://stackoverflow.com/questions/2387485/limiting-the-time-a-program-runs-in-linux)
In this case, that would amount to:

timeout 10s julia --startup-file=no -e 'using DaemonMode; runargs()' program.jl <arguments>

Add a REST HTTP API

This should also solve the remote execution problem, and it'll also make everything much faster, as currently you need to run julia --startup-file=no -e 'using DaemonMode; runargs()' program.jl <arguments>, which has huge overhead compared to an HTTP request:

❯ time julia --startup-file=no --print "pi"
π
julia --startup-file=no --print "pi"  0.30s user 1.22s system 25% cpu 6.014 total

❯ time (print -nr -- '{"cmd":"echo hi from zsh","stdin":"\n","verbose":"0"}' | curl --fail --silent --location --header 'Content-Type: application/json' --request POST --data @- http://127.0.0.1:7230/zsh/
) # my own version of this package for the zsh language
hi from zsh
( print -nr -- '{"cmd":"echo hi from zsh","stdin":"\n","verbose":"0"}' | curl)  0.01s user 0.01s system 36% cpu 0.042 total

I think the HTTP API can be a separate package. I have implemented my own zsh version using two Python packages: Brish, which is similar to DaemonMode.jl, and BrishGarden, which uses Brish to implement an HTTP API with multithreading. I'm open to implementing an HTTP API over DaemonMode.jl too, but I don't know when I'll have the time.
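
For what it's worth, a very rough sketch of a thin HTTP front-end over DaemonMode (hypothetical code, assuming HTTP.jl and a DaemonMode server already running on its default port; capturing and returning the daemon's output is left out):

# Hypothetical HTTP front-end: POST a body of Julia code, forward it to the daemon.
using HTTP
using DaemonMode

HTTP.serve("127.0.0.1", 7230) do req
    code = String(req.body)
    runexpr(code)                    # the daemon's reply is printed to this process's stdout
    return HTTP.Response(200, "ok\n")
end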

Enable separate stderr and stdout

At the moment those two channels are pooled into the output buffer.

When calling an external tool, it is sometimes useful to have separate output channels.

Error running scripts calling `@info`

Consider the script test.jl

x = 123
@info x

Running the script like julia --project=. --startup-file=no test.jl yields the expected output

[ Info: 123

However, starting the daemon server by julia --startup-file=no --project=. -e "using DaemonMode; serve()", and running the script by julia --project=. --startup-file=no -e"using DaemonMode; runargs()" test.jl, gives the following error

ERROR: LoadError: MethodError: no method matching split(::Int64, ::String)
Stacktrace:
 [1] mylog at /Users/jonalm/.julia/packages/DaemonMode/fAZD4/src/DaemonMode.jl:231
 [2] #handle_message#19 at /Users/jonalm/.julia/packages/LoggingExtras/zT9ZU/src/formatlogger.jl:57
 [3] handle_message at /Users/jonalm/.julia/packages/LoggingExtras/zT9ZU/src/formatlogger.jl:52
 [4] #handle_message#13 at /Users/jonalm/.julia/packages/LoggingExtras/zT9ZU/src/minlevelfiltered.jl:17
 [5] handle_message at /Users/jonalm/.julia/packages/LoggingExtras/zT9ZU/src/minlevelfiltered.jl:16
 [6] top-level scope at logging.jl:353
 [7] eval at ./boot.jl:373
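
Until this is fixed, a possible workaround in the script itself, in line with the suggestion elsewhere in this issue list to send logs to files, is to install a plain file logger so that @info never goes through DaemonMode's log handling (a sketch; the file name is arbitrary):

# Route logging to a file instead of DaemonMode's redirected streams.
using Logging

logio = open("daemon.log", "a")
global_logger(SimpleLogger(logio))

x = 123
@info x
flush(logio)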

Do you care for a logo for DaemonMode

I want to add DaemonMode to Juliawin (https://github.com/heetbeet/juliawin) so that the user has the option to run Julia scripts with something like julia-daemon name_of_script.jl instead of julia name_of_script.jl. I wrap all my executables in nice icons for a pleasant experience. Do you like any of the following logos? And do you mind me associating one of them with your project?
[image: proposed logo designs]

Do you perhaps have a logo I just couldn't find?
If not, do you want this logo once I'm finished designing it?
Do you have any suggestions or critiques?
