
parallel processing - Distributed computation in Julia with Python-imported code: UndefVarError: __anon__ not defined

I have this code (the file name is test.jl), which is a simplified version of a more complex program:

using Distributed, SharedArrays
using LinearAlgebra
using PyCall
@everywhere @pyimport scipy.optimize as so

function fun()                              # Coupling constants

    Ntraj = 2
    Ntime = 10

    result = @distributed (+) for ktraj = 1:Ntraj
        println("Step_1")

        # One cycle, one trajectory
        for jt = 1:Ntime
            println(jt)
            fidelity = x -> x[1]*x[2]*x[3] - x[1] - x[2] - x[3]
            x0 = [0 0 0]
            println(fidelity(x0))
            println(so.minimize(fidelity, x0))
        end
    end

    return 0
end

I call it from the notebook in the following way:

using Distributed
using PyCall
@pyimport scipy.optimize as so

addprocs(2)

@everywhere include("test.jl")
@time fun()

Apart from the warnings:

┌ Warning: `@pyimport foo` is deprecated in favor of `foo = pyimport("foo")`.
│   caller = _pywrap_pyimport(::PyObject) at PyCall.jl:410
└ @ PyCall ~/.julia/packages/PyCall/tqyST/src/PyCall.jl:410

(the same warning is printed several more times)
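(As the warning itself suggests, `@pyimport foo` has been superseded by `foo = pyimport("foo")`. For reference, a minimal sketch of the non-deprecated import with PyCall's `pyimport` function would be:)

using PyCall
# Bind the module object directly instead of going through the @pyimport macro
so = pyimport("scipy.optimize")
# its functions are then reached as properties, e.g. so.minimize(f, x0)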

I receive this error:

TaskFailedException:
On worker 2:
UndefVarError: __anon__ not defined
deserialize_module at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:915
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:812
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:820
deserialize_fillarray! at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1112
deserialize_array at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1104
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:786
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735 [inlined]
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1010
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:878
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735 [inlined]
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:947
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:878
deserialize_fillarray! at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1112
deserialize_array at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1104
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:786
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735 [inlined]
deserialize_typename at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:1177
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/clusterserialize.jl:68
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:878
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735
deserialize_datatype at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:0
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:790
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735
handle_deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:795
deserialize at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Serialization/src/Serialization.jl:735 [inlined]
deserialize_msg at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/messages.jl:99
#invokelatest#1 at ./essentials.jl:712 [inlined]
invokelatest at ./essentials.jl:711 [inlined]
message_handler_loop at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/process_messages.jl:185
process_tcp_streams at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/process_messages.jl:142
#97 at ./task.jl:358
Stacktrace:
 [1] remotecall_fetch(::Function, ::Distributed.Worker, ::Function, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/remotecall.jl:390
 [2] remotecall_fetch(::Function, ::Distributed.Worker, ::Function, ::Vararg{Any,N} where N) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/remotecall.jl:382
 [3] remotecall_fetch(::Function, ::Int64, ::Function, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/remotecall.jl:417
 [4] remotecall_fetch at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/remotecall.jl:417 [inlined]
 [5] (::Distributed.var"#155#156"{typeof(+),var"#9#12"{Int64},UnitRange{Int64},Array{UnitRange{Int64},1},Int64,Int64})() at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/macros.jl:270

Stacktrace:
 [1] wait at ./task.jl:267 [inlined]
 [2] fetch at ./task.jl:282 [inlined]
 [3] iterate at ./generator.jl:47 [inlined]
 [4] collect(::Base.Generator{Array{Task,1},typeof(fetch)}) at ./array.jl:665
 [5] preduce(::Function, ::Function, ::UnitRange{Int64}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.4/Distributed/src/macros.jl:274
 [6] fun at /home/candeloro/feedback_qw/Feedback_Adjacency_2Measurements_N5_5HF/test.jl:12 [inlined]
 [7] macro expansion at ./util.jl:175 [inlined]
 [8] top-level scope at ./In[6]:8

I don't really know how to fix it.

question from: https://stackoverflow.com/questions/65849286/distributed-calculus-in-julia-with-python-imported-code-undefvarerror-anon


1 Answer


Here is the code after cleanup; it works. The main problem is that the @distributed macro was trying to serialize the Python module and ship it around the cluster (it does not know that this is a library handle rather than ordinary data). So I wrapped it in a function that is always called locally on each worker process, so the module object itself is never copied.


using Distributed, LinearAlgebra, PyCall
addprocs(2)
@everywhere using Distributed, LinearAlgebra, PyCall

@everywhere const myso = pyimport("scipy.optimize")
@everywhere getso() = myso

function fun()
    Ntraj = 2
    Ntime = 10
    result = @distributed (+) for ktraj = 1 : Ntraj
        for jt=1:Ntime
            println(jt)
            fidelity = x -> x[1]*x[2]*x[3]-x[1]-x[2]-x[3]
            x0 = [0 0 0]
            println(fidelity(x0))
            println(getso().minimize(fidelity,x0))                                
        end
        1 # the (+) reducer needs each iteration to return a value; alternatively drop the reducer and use @sync @distributed for ktraj = 1:Ntraj
    end
end

fun()
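The getso() function barrier matters because @distributed serializes whatever the loop body captures, and the binding created by @pyimport (a PyObject wrapped in an anonymous module) does not survive that round trip, which is presumably what produced the UndefVarError: __anon__ on the worker. An alternative sketch, assuming PyCall is available on every worker, imports the module inside the loop body so no Python object is ever serialized (Python caches imported modules, so the repeated pyimport is cheap); the fun_alt name below is just illustrative:

using Distributed, PyCall
addprocs(2)
@everywhere using PyCall

function fun_alt()
    Ntraj = 2
    result = @distributed (+) for ktraj = 1:Ntraj
        so = pyimport("scipy.optimize")   # resolved locally on each worker
        fidelity = x -> x[1]*x[2]*x[3] - x[1] - x[2] - x[3]
        println(so.minimize(fidelity, [0.0, 0.0, 0.0]))
        1   # value for the (+) reducer
    end
    return result
end

fun_alt()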
