Sampling bitstring probabilities of ITensorNetworks

Is there a way to get the coefficients of a wave function in the computational z-basis from an ITensorNetwork object? If psi is an ITensorNetwork on a spin-half graph, the function sample!(psi) for sampling the coefficients works when psi is an MPS but not when it is an ITensorNetwork. Is there an equivalent that works for an ITensorNetwork?

I tried using projection operators to find the individual bitstring probabilities through expect(projectionOp, psi), where projectionOp for a system of size N is one of the 2^N projection operators in the computational basis, but this becomes slow as the system size grows. Taking expectation values with the expect_state_SBP() function would be faster, but it doesn't work for non-local projection operators, since applying an operator on more than two sites to the state is not supported yet.
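For concreteness, here is roughly what I mean by the projector approach, written for an MPS where the contraction is straightforward (bitstring_probability is just a name I made up, not a library function). The bottleneck is that reconstructing the full distribution this way means looping over all 2^N bitstrings:

```julia
using ITensors

# Probability of a single computational-basis bitstring for an MPS `psi`.
# Equivalent to the expectation value of the projector |b><b|, but computed
# as an overlap with the product state |b>.
function bitstring_probability(psi::MPS, bits::Vector{String})
  sites = siteinds(psi)
  b = MPS(sites, bits)          # product state |b1 b2 ... bN>
  return abs2(inner(b, psi))    # |<b|psi>|^2
end

# e.g. for a 4-site spin-half state:
# p = bitstring_probability(psi, ["Up", "Dn", "Up", "Up"])
```

What I am missing is the analogous (and ideally sampling-based) operation for an ITensorNetwork.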

Julia Version 1.9.3
ITensors v0.3.44

It’s a good question. I don’t believe this has been generalized to ITensorNetworks yet. Because of the scaling issue you point out with the expect-based approach, it’s really something that should be coded up properly for ITensorNetworks. That being said, I assume @mtfishman and Joey Tindall, who are leading the effort, would prefer that it not be coded as a one-off function the way it is in ITensors.jl, but rather implemented using some of the newer designs they are pursuing that unify algorithms like sample with other algorithms such as dmrg.

If you are using a tree tensor network, @bkloss wrote a faster version of expect here: Efficient implementation of `expect` for `AbstractTTN` by b-kloss · Pull Request #129 · mtfishman/ITensorNetworks.jl · GitHub. Sampling would be easy to write in a similar way, but as @miles said, we are trying to build a more systematic approach where we share code structures across various algorithms, instead of writing many one-off functions like in the MPS/MPO code currently in ITensors.jl, which has become difficult to maintain and generalize to new functionality because of the sheer number of functions and a code design that doesn’t share enough pieces across functions, even though many of them have very similar code or algorithmic structure.
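For reference, here is a rough sketch of how "perfect sampling" is structured for an MPS, modeled on the sample function in ITensors.jl (an illustrative sketch, not the exact library code). The same sweep-and-project pattern is what would generalize to a traversal of a tree tensor network:

```julia
using ITensors

# Draw one bitstring from |psi|^2 by sweeping left to right,
# sampling each site conditioned on the outcomes already drawn.
# Assumes `psi` is normalized.
function sample_bitstring(psi::MPS)
  psi = orthogonalize(psi, 1)      # put the orthogonality center at site 1
  N = length(psi)
  result = zeros(Int, N)
  A = psi[1]
  for j in 1:N
    s = siteind(psi, j)
    d = dim(s)
    r = rand()
    pdisc = 0.0                    # accumulated probability
    n = 1
    An = ITensor()
    pn = 0.0
    while n <= d
      projn = ITensor(s)
      projn[s => n] = 1.0
      An = A * dag(projn)                # project site j onto local state n
      pn = real(scalar(dag(An) * An))    # conditional probability of outcome n
      pdisc += pn
      (r < pdisc) && break
      n += 1
    end
    result[j] = n
    if j < N
      # absorb the projected tensor into the next site and renormalize
      A = psi[j + 1] * An
      A *= (1.0 / sqrt(pn))
    end
  end
  return result   # e.g. [1, 2, 1, 1], i.e. |up, down, up, up> for spin-1/2 sites
end
```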

For general tensor networks, the question always comes down to what contraction algorithm you want to use. It would be easy to write an expect function or a sampling function using belief propagation (BP), but I would say we are still in an exploratory phase where we are trying to come up with the right data structures and code patterns for BP-based contractions, so that functionality like computing expectation values, sampling, and running algorithms like DMRG, TDVP, and gate evolution can be implemented with only small modifications from each other. With that BP-based design, it would trivially work when the network happens to be a tree, so we would have very general functionality covering both tree tensor networks and tensor networks with loops, with only a small amount of code to maintain (while also making it easier to implement new algorithms).
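To give a flavor of the BP perspective, here is a toy illustration using plain ITensors (not the ITensorNetworks.jl API or its data structures) on a 3-site chain, where BP is exact because the network is a tree. The "messages" are just the contracted environments flowing along each bond of the norm network, and a local expectation value comes from combining the incoming messages with the site tensor:

```julia
using ITensors

# A 3-site "network" with bond dimension 2
s = siteinds("S=1/2", 3)
l12 = Index(2, "l12")
l23 = Index(2, "l23")
T1 = randomITensor(s[1], l12)
T2 = randomITensor(l12, s[2], l23)
T3 = randomITensor(l23, s[3])

# Messages on the norm network <psi|psi>: contract each leaf tensor with its
# conjugate over the physical index, leaving a matrix on the bond indices.
m12 = T1 * dag(prime(T1, l12))    # message from site 1 toward site 2
m32 = T3 * dag(prime(T3, l23))    # message from site 3 toward site 2

# Single-site reduced density matrix at site 2 built from the incoming messages
rho2 = m12 * T2 * dag(prime(T2)) * m32
rho2 /= scalar(rho2 * delta(s[2], s[2]'))   # normalize so Tr(rho2) = 1

# Local expectation value, e.g. <Sz> on site 2
ev = real(scalar(rho2 * op("Sz", s[2])))
```

On a loopy network the messages would instead be found by iterating such updates to a fixed point, which is the approximation BP makes.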


Thank you!
