I would like to use the `tdvp()` function from ITensorTDVP.jl to prepare a thermal state of a long-range Hamiltonian. My Hamiltonian is a long-range fermionic model, which I build with the `OpSum()` method.
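Schematically, the construction looks like the following (a minimal sketch only: the power-law hopping, the `"Fermion"` site type, `N = 10`, and the helper name `build_hamiltonian` are placeholders for illustration, not my actual model):

```julia
using ITensors

# Placeholder long-range Hamiltonian acting on the system (odd) sites only;
# the even (bath/ancilla) sites carry no terms.
function build_hamiltonian(s)
  os = OpSum()
  Ntot = length(s)
  for i in 1:2:Ntot, j in (i + 2):2:Ntot
    t_ij = 1.0 / abs(j - i)        # illustrative power-law coupling
    os += t_ij, "Cdag", i, "C", j
    os += t_ij, "Cdag", j, "C", i
  end
  return MPO(os, s)
end

N = 10                             # number of system sites (illustrative)
s = siteinds("Fermion", 2N)        # interleaved system + bath chain
H = build_hamiltonian(s)
```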
I first create a system+bath MPS in which alternating sites belong to the system and the bath, respectively, and then put a Bell pair on each adjacent system-bath pair of sites to obtain an infinite-temperature state. Then I use `tdvp` to perform imaginary time evolution (following the purification procedure in arXiv:1901.05824, Sec. 2.7).
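Concretely, the initial state is built roughly like this (a sketch, assuming the interleaved site indices `s` from the snippet above, with odd sites as the system and even sites as the bath; which pair of occupations gets entangled is just my convention):

```julia
# One maximally entangled ("Bell") pair per (system, bath) pair of
# neighboring sites: (|01⟩ + |10⟩)/√2 on sites (2j-1, 2j).
tensors = ITensor[]
for j in 1:2:length(s)
  s1, s2 = s[j], s[j + 1]
  wf = ITensor(s1, s2)
  wf[s1 => 1, s2 => 2] = 1 / sqrt(2)   # |empty⟩_sys ⊗ |occupied⟩_bath
  wf[s1 => 2, s2 => 1] = 1 / sqrt(2)   # |occupied⟩_sys ⊗ |empty⟩_bath
  U, S, V = svd(wf, s1)                # split into two MPS site tensors
  push!(tensors, U, S * V)
end
ψ0 = MPS(tensors)
```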
```julia
ttotal = 0.6
tau = 0.02

ϕ = tdvp(H, -im * ttotal, ψ0;
         time_step = -im * tau, # nsweeps=5,
         reverse_step = false,
         normalize = true,
         maxdim = 250, cutoff = 1e-10,
         outputlevel = 3);
```
I measure the final energy using `E_f = inner(ϕ', H, ϕ)`.
As the total evolution time `ttotal` increases, the bond dimension grows and eventually maxes out at `maxdim`. However, even for very long imaginary time evolution, the energy does not converge to the ground-state energy obtained from DMRG, i.e. from:
```julia
e2, ϕ2 = ITensors.dmrg(H, ψ0; nsweeps=25, maxdim=250, cutoff=1e-10)
```
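To make the comparison concrete, this is roughly what I am doing (a schematic sketch only; the helper name `energy_vs_time` and the number of steps are just for illustration):

```julia
using ITensors, ITensorTDVP

# Evolve in small steps, watching <H> and the bond dimension as the total
# evolution time grows, then compare against the DMRG reference energy.
function energy_vs_time(H, ψ0; tau = 0.02, nsteps = 30)
  ψ = copy(ψ0)
  for n in 1:nsteps
    # same (-im * tau) convention as in my tdvp call above
    ψ = tdvp(H, -im * tau, ψ;
             reverse_step = false, normalize = true,
             maxdim = 250, cutoff = 1e-10)
    println("t = ", n * tau,
            "   <H> = ", real(inner(ψ', H, ψ)),
            "   maxlinkdim = ", maxlinkdim(ψ))
  end
  return ψ
end

ϕ = energy_vs_time(H, ψ0)

# DMRG reference
e2, ϕ2 = ITensors.dmrg(H, ψ0; nsweeps = 25, maxdim = 250, cutoff = 1e-10)
println("DMRG reference energy = ", e2)
```

I print `maxlinkdim` as well, since that is how I see the bond dimension saturating at `maxdim`.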
Am I missing something in the `tdvp()` function? How can I verify that `tdvp()` is doing what I want it to do, i.e. that it actually gives the ground state for very long imaginary time evolution? Should I change the time step `tau`? Any suggestions would be very helpful!
Also, how do I check whether this TDVP algorithm is 1-site or 2-site? Can this be changed simply by setting the argument `nsite=1` or `nsite=2` in the `tdvp()` call?
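To make the last question concrete, what I have in mind is something like this (assuming the keyword really is called `nsite` in the ITensorTDVP version I am using; I have not verified that):

```julia
ϕ = tdvp(H, -im * ttotal, ψ0;
         nsite = 2,                 # ← would this select the 2-site variant?
         time_step = -im * tau,
         reverse_step = false, normalize = true,
         maxdim = 250, cutoff = 1e-10)
```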
Thanks!