norm or inner after addition for ITensorNetworks

I am fairly new to ITensors and was working with ITensorNetworks.jl. I wanted to calculate the norm/inner product after adding two tensor network states, for example \psi = \psi_1 + \psi_2, and this gives me some behaviour that I do not understand.

When I do it on a 2D network with 1 loop, the norm of \psi is 1. But when I increase the network to more than one loop, the norm of \psi seems to be \frac{1}{\sqrt{2^{n-1}}}, where n is the number of loops in the network.

Here is a minimal working example

using ITensorNetworks: ITensorNetwork, siteinds
using LinearAlgebra: norm
using NamedGraphs.NamedGraphGenerators: named_grid

g1 = named_grid((2, 2)) # This has 1 loop
g2 = named_grid((3, 2)) # This has 2 loops
g3 = named_grid((3, 3)) # This has 4 loops

s = siteinds("S=1/2", g1)
ψ1 = ITensorNetwork(v -> "↑", s)
ψ2 = ITensorNetwork(v -> "↓", s)
ψ = ψ1 + ψ2
println("Norm when there is one loop = ", norm(ψ))

s = siteinds("S=1/2", g2)
ψ1 = ITensorNetwork(v -> "↑", s)
ψ2 = ITensorNetwork(v -> "↓", s)
ψ = ψ1 + ψ2
println("Norm when there are two loops = ", norm(ψ))

s = siteinds("S=1/2", g3)
ψ1 = ITensorNetwork(v -> "↑", s)
ψ2 = ITensorNetwork(v -> "↓", s)
ψ = ψ1 + ψ2
println("Norm when there are four loops = ", norm(ψ))

Furthermore, let's say I normalize \psi by multiplying it by \sqrt{2^{n-1}}. If I then add another state to it in a subsequent step, the norm remains 1 irrespective of the number of loops in the network.

Here is an example that shows this:


using ITensorNetworks: ITensorNetwork, siteinds
using LinearAlgebra: norm
using NamedGraphs.NamedGraphGenerators: named_grid

g2 = named_grid((3, 3)) # This has 4 loops
s = siteinds("S=1/2", g2)
ψ1 = ITensorNetwork(v -> "↑", s)
ψ2 = ITensorNetwork(v -> "↓", s)

ψ = ψ1 + ψ2
println("The norm without normalization = ", norm(ψ))

ψ = sqrt(8) * (ψ1 + ψ2)
println("The norm after normalization = ", norm(ψ))

ψ = ψ + ψ2 + ψ1 # This does not require normalization
println("The norm after doing addition again = ", norm(ψ))

So, I am confused about two things:
(a) Is + indeed the simple addition of two states? That is, if I just want to add wavefunctions \psi_1 + \psi_2 on a network, is simply doing such an addition correct? Typically I would just put in a factor of 1/\sqrt{2} to normalize, but here that does not seem to be the case.
(b) Why does the norm/inner product depend on the number of loops of the underlying network after the addition, and why are subsequent additions not affected?
Am I missing something?


Hi! Thanks for the question.

What's going on here is that norm(ψ) calls inner(ψ, ψ), which defaults to the belief propagation (BP) contraction scheme, which is approximate. Hence, if you pass it a loopy network it will only return an approximation to the norm, and that approximation can be particularly bad when there are lots of loops.

If you change your call from norm(ψ) to norm(ψ; alg = "exact") you will get the answer you expect. However, the computational cost of exact contraction scales very poorly with the size of the network.
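For example, reusing the 3×3 grid from your example, only the keyword changes (a minimal sketch):

using ITensorNetworks: ITensorNetwork, siteinds
using LinearAlgebra: norm
using NamedGraphs.NamedGraphGenerators: named_grid

g = named_grid((3, 3)) # loopy network with 4 loops
s = siteinds("S=1/2", g)
ψ = ITensorNetwork(v -> "↑", s) + ITensorNetwork(v -> "↓", s)

println("BP (default) norm = ", norm(ψ))                # approximate on loopy networks
println("Exact contraction = ", norm(ψ; alg = "exact")) # gives the expected sqrt(2)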

The BP backend scales much better but is only an approximation. However, if you pass it a tree it will be exact.

For instance:

using ITensorNetworks: ITensorNetwork, siteinds
using LinearAlgebra: norm
using NamedGraphs.NamedGraphGenerators: named_comb_tree

g = named_comb_tree((3, 3)) # a tree, so no loops; the size here is arbitrary
s = siteinds("S=1/2", g)
ψ1 = ITensorNetwork(v -> "↑", s)
ψ2 = ITensorNetwork(v -> "↓", s)
ψ = ψ1 + ψ2
println("Norm on a tree = ", norm(ψ))

returns sqrt(2) as expected.
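To spell out why \sqrt{2} is the expected value: \psi_1 and \psi_2 are orthonormal product states, so \langle\psi|\psi\rangle = \langle\psi_1|\psi_1\rangle + \langle\psi_2|\psi_2\rangle + 2\,\mathrm{Re}\,\langle\psi_1|\psi_2\rangle = 1 + 1 + 0 = 2, and hence the norm is \sqrt{2}.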

Ideally this would be documented, but as the library is still at a developmental stage we don't want to write docs that could completely change.

It is confusing to see it return the wrong answer. However, if we made the default contraction scheme alg = "exact", users would find their code just takes forever for even moderately sized systems. Perhaps in the future we might force the user to specify the algorithm in order to avoid confusion.

Hi,
Thank you very much for your reply. The norm/inner product is indeed fine when I set the algorithm explicitly to “exact”.

It seems to me that when I act an operator on a tensor network, the norm/inner product with belief propagation is fine; it is only when I add two tensor networks that I face this problem with the norm/inner product.
I just wanted to confirm: does this issue with belief propagation only pertain to the addition of tensor networks, or is it there for other operations as well?

In general, if the tensor network has “loops”, i.e. is_tree(tn::AbstractITensorNetwork) = false, and the bond dimension is greater than one, i.e. maxlinkdim(tn::AbstractITensorNetwork) > 1, then any results you get from belief propagation are approximate and should be treated with caution. If you act an operator on a tensor network and the bond dimension remains 1, you will still get correct results, but if it grows then you might not.
It's not an “issue” with belief propagation per se, it's just how the algorithm works: it is exact only on trees or product states.
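As a rough sketch of that rule of thumb in code (the import paths below are my assumption and may differ between versions of ITensorNetworks and NamedGraphs; is_tree and maxlinkdim are the functions referenced above):

# Sketch only: import paths are an assumption and may differ by version.
using ITensorNetworks: maxlinkdim
using NamedGraphs.GraphsExtensions: is_tree

# BP results are exact only on trees or when the bond dimension is 1.
bp_is_exact(tn) = is_tree(tn) || maxlinkdim(tn) == 1

if !bp_is_exact(ψ)
    @warn "Belief propagation norm/inner is only approximate for this network"
end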

Thanks a lot for the details!