Hi!
I have another memory question about ITensors.jl. I wrote a code that simulates the time evolution of a spin-1/2 system with both n.n. and n.n.n. couplings. When I ran it on an HPC cluster, I noticed that the memory usage could become quite large, and the job sometimes exceeded the memory limit and was killed. Here is the code:
using ITensors
using HDF5

let
# Parameters
N = parse(Int64,ARGS[1]);
T = 1
h = 1*pi*(1-0.1)/T
M = (1-0.02)*pi/T
J = 1/2/T
ω = 2*pi/T
t_tot = parse(Int64,ARGS[2])*T
dt = 0.01
measure_step = 100
Nsteps = Int(t_tot/dt);
Nperiods = Int(Nsteps/measure_step)
maxdim = parse(Int64,ARGS[3]);
path = string(ARGS[4]);
cut_off = 1E-8;
# MPS time evolution
## Initial MPS preparation
s = siteinds("S=1/2",N;conserve_qns=false)
ψ = MPS(ComplexF64,s);
# Set every site to the "up" state, i.e. the |↑↑...↑⟩ product state
for i=1:N
ψ[i][1,1,1]=1;
ψ[i][1,2,1]=0;
end
orthogonalize!(ψ,1);
# Initial measurement
Szlist = Array{Float64,1}(undef,0);
tlist = Array{Float64,1}(undef,0);
# Average of 2*<Sz> over the odd sites
temp = sum(2*expect(ψ,"Sz";sites=1:2:N))/(N/2);
append!(tlist,0)
append!(Szlist,temp)
for k = 1:Nperiods
ψ = single_period_Trotter_1st_order(J,h,N,s,ω,M,T,dt,ψ,cut_off,k,maxdim)
temp1 = sum(2*expect(ψ,"Sz";sites=1:2:N))/(N/2);
append!(Szlist,temp1)
append!(tlist,k*measure_step*dt)
# Save data
# print(k)
filename = "N"*ARGS[1]*"ttotal"*ARGS[2]*"maxdim"*ARGS[3]*".h5"
h5open(path * "/" * filename,"w") do file
g = create_group(file, "Parameters") # create a group
g["N"] = N
g["T"] = T
g["h"] = h
g["M"] = M
g["dt"] = dt
g["ttotal"] = t_tot
# g["initialstate"] = initial_state_type
g["cutoff"] = cut_off
g["maxbd"] = maxdim
g["measurestep"] = measure_step
R = create_group(file,"Results")
write(file, "Results/Mz", Szlist)
write(file, "Results/tlist", tlist)
# attributes(g)["Description"] = "This group contains only a single dataset" # an attribute
end
end
end
Note that the function single_period_Trotter_1st_order is similar to the example given in the ITensors.jl documentation, and its key step, applying all of the Trotterized gates, is:
# e^{-i H_{odd} dt}|ψ>
ψ = apply(UOdd,ψ;cutoff=cutoff,maxdim=maxdim)
# e^{-i H_{even} dt}|ψ>
ψ = apply(UEven,ψ;cutoff=cutoff,maxdim=maxdim)
In addition, as the main loop above shows, I am trying to save the data on the fly with HDF5 after each measurement period; a simplified sketch of the single-period Trotter function is included below for reference.
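To give a concrete (but simplified) picture, the structure of the function is roughly as follows. This is only a sketch: the gate contents here (a n.n. ZZ coupling J plus a transverse field h) are placeholders, and the actual function also builds the n.n.n. gates and the drive terms involving ω, M, and the period index k:

# Simplified sketch only: the real gate definitions (n.n.n. couplings, drive via ω, M, k) are omitted
function single_period_Trotter_1st_order_sketch(J,h,N,s,T,dt,ψ,cutoff,maxdim)
    # Two-site gates e^{-i h_j dt} on odd and even n.n. bonds
    UOdd  = ITensor[]
    UEven = ITensor[]
    for j = 1:(N-1)
        hj = J*op("Sz",s[j])*op("Sz",s[j+1])   # placeholder bond Hamiltonian
        G  = exp(-im*dt*hj)
        isodd(j) ? push!(UOdd,G) : push!(UEven,G)
    end
    # Single-site field gates e^{-i h Sx dt}
    Ufield = ITensor[exp(-im*dt*h*op("Sx",s[j])) for j = 1:N]
    # One driving period T of first-order Trotter steps
    for step = 1:Int(round(T/dt))
        ψ = apply(UOdd,  ψ; cutoff=cutoff, maxdim=maxdim)
        ψ = apply(UEven, ψ; cutoff=cutoff, maxdim=maxdim)
        ψ = apply(Ufield,ψ; cutoff=cutoff, maxdim=maxdim)
    end
    return ψ
end

The main point is that each call to apply applies the list of gates with SVD truncation controlled by cutoff and maxdim.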
I found that for N=24 the job could already use more than 64 GB of memory, and for N=56 it exceeded 80 GB (PBS killed it partway through, so I could not see the actual peak). Is this normal, or is there anything I could optimize in my code to reduce the memory usage? Or could it have something to do with the HDF5 saving? Here, I use maxdim=100 for all calculations.
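In case it is useful for diagnosing this, I could add something like the following inside the period loop to track the bond dimension and memory usage (these lines are not in my current code; maxlinkdim is from ITensors, the rest is standard Julia):

# Hypothetical per-period diagnostics (not in the code above)
@show maxlinkdim(ψ)                  # current maximum bond dimension of the MPS
@show Base.summarysize(ψ)/1024^2     # approximate size of the MPS in MiB
@show Base.gc_live_bytes()/1024^3    # live heap size in GiB
GC.gc()                              # optionally force a garbage-collection pass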
Thanks so much!
Tianqi Chen