# QR Factorization of singular matrix constructed from SVD

TL;DR: Applying a QR factorization to the \Sigma matrix returned from an SVD throws an error. Minimal working example:

```julia
using ITensors

i = Index(4, "i")
j = Index(4, "j")
k = Index(4, "k")
A = randomITensor(i, j, k)

U, Σ, V = svd(A, i)

ITensors.qr(Σ, inds(Σ)[1])
```


The error I get is:

```
MethodError: similar(::NDTensors.DiagTensor{Float64, 2, Tuple{Index{Int64}, Index{Int64}}, NDTensors.Diag{Float64, Vector{Float64}}}, ::Type{Float64}, ::Tuple{Int64, Int64}) is ambiguous. Candidates:
similar(a::AbstractArray, ::Type{T}, dims::Tuple{Vararg{Int64, N}}) where {T, N} in Base at abstractarray.jl:806
similar(a::AbstractArray, ::Type{T}, dims::Tuple{Integer, Vararg{Integer}}) where T in Base at abstractarray.jl:804
similar(a::AbstractArray, ::Type{T}, dims::Tuple{Union{Integer, Base.OneTo}, Vararg{Union{Integer, Base.OneTo}}}) where T in Base at abstractarray.jl:803
similar(A::AbstractArray, ::Type{T}, shape::Tuple{Union{Integer, AbstractUnitRange}, Vararg{Union{Integer, AbstractUnitRange}}}) where T in OffsetArrays at /home/mintycocoa/.julia/packages/OffsetArrays/80Lkc/src/OffsetArrays.jl:320
similar(T::NDTensors.Tensor, args...) in NDTensors at /home/mintycocoa/.julia/packages/NDTensors/c2BpJ/src/tensor.jl:132
Possible fix, define
similar(::NDTensors.Tensor, ::Type{T}, ::Tuple{Int64, Vararg{Int64, N}}) where {T, N}

Stacktrace:
[1] copy_similar(A::NDTensors.DiagTensor{Float64, 2, Tuple{Index{Int64}, Index{Int64}}, NDTensors.Diag{Float64, Vector{Float64}}}, #unused#::Type{Float64})
@ LinearAlgebra ~/.julia/juliaup/julia-1.8.0-rc4+0.x64/share/julia/stdlib/v1.8/LinearAlgebra/src/LinearAlgebra.jl:387
[2] qr(::NDTensors.DiagTensor{Float64, 2, Tuple{Index{Int64}, Index{Int64}}, NDTensors.Diag{Float64, Vector{Float64}}}, ::Tuple{Int64}, ::Vararg{Tuple{Int64}}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ LinearAlgebra ~/.julia/juliaup/julia-1.8.0-rc4+0.x64/share/julia/stdlib/v1.8/LinearAlgebra/src/qr.jl:427
[3] qr(::NDTensors.DiagTensor{Float64, 2, Tuple{Index{Int64}, Index{Int64}}, NDTensors.Diag{Float64, Vector{Float64}}}, ::Tuple{Int64}, ::Tuple{Int64})
@ LinearAlgebra ~/.julia/juliaup/julia-1.8.0-rc4+0.x64/share/julia/stdlib/v1.8/LinearAlgebra/src/qr.jl:425
[4] qr(A::ITensor, Linds::Index{Int64}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ ITensors ~/.julia/packages/ITensors/OjQuG/src/decomp.jl:355
[5] qr(A::ITensor, Linds::Index{Int64})
@ ITensors ~/.julia/packages/ITensors/OjQuG/src/decomp.jl:334
[6] top-level scope
@ In[56]:9
[7] eval
@ ./boot.jl:368 [inlined]
[8] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
```


I am an undergraduate physics student just starting out with tensor networks. To get my head around matrix product states, I was trying to write a function that decomposes a tensor into a matrix product state by repeated SVD. For instance, given a tensor T^{s_1s_2s_3s_4}, I can write it as

T^{s_1s_2s_3s_4} = U^{s_1}_{\alpha_1} \Sigma^{\alpha_1}_{\beta_1} (V^\intercal)^{\beta_1}_{\gamma_1} \tau^{\gamma_1s_2s_3s_4}

and then factorize \Sigma and absorb one half into U and the other half into V^\intercal. However, when I try to factorize \Sigma with `Q, R = qr(Σ, inds(Σ)[1])`, it throws the error shown above. Any help would be greatly appreciated.

It’s a good question, and it can be answered on a few different levels.

1. Actually, the most useful way to decompose a tensor by a sequence of SVDs is to multiply the singular value matrix into V (if you are going from left to right), rather than taking the square root of \Sigma and putting one square root into each of U and V. Leaving U untouched is the best approach because the resulting MPS tensor (made from U) then has the “left orthogonal” property, and iterating this procedure produces an MPS in left-orthogonal form, which is a very useful form of an MPS.
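As a minimal sketch of this left-to-right sweep (`tensor_to_mps` is my own hypothetical helper, not an ITensors API; index names and dimensions are made up for illustration):

```julia
using ITensors

# Sketch: build a left-orthogonal MPS from a tensor by repeated SVD,
# absorbing S*V into the remainder instead of splitting sqrt(Σ)
# between U and V.
function tensor_to_mps(T::ITensor, sites)
    tensors = ITensor[]
    R = T
    link = nothing
    for n in 1:(length(sites) - 1)
        left = link === nothing ? (sites[n],) : (link, sites[n])
        U, S, V = svd(R, left...)
        push!(tensors, U)   # U is left-orthogonal by construction
        R = S * V           # carry the singular values to the right
        link = commonind(U, S)
    end
    push!(tensors, R)       # last tensor holds the remaining weight
    return tensors
end

s = [Index(2, "s$n") for n in 1:4]
T = randomITensor(s...)
mps_tensors = tensor_to_mps(T, s)
```

Contracting all the returned tensors back together reproduces the original tensor, which is a quick way to sanity-check the decomposition.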

2. A related thing to know is that ITensor already has a routine to transform a tensor into an MPS: just call the MPS constructor, passing the tensor and a collection of site indices, as discussed here: https://itensor.github.io/ITensors.jl/stable/examples/MPSandMPO.html#Creating-an-MPS-from-a-Tensor
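For example, following the linked documentation (sizes here are arbitrary):

```julia
using ITensors

s = siteinds(2, 4)   # four site indices of dimension 2
T = randomITensor(s...)
psi = MPS(T, s)      # splits T into an MPS by repeated factorization
```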

3. The \Sigma matrix is already upper triangular, since it is in fact diagonal, so its QR decomposition requires no extra work: Q is just the identity matrix and R equals \Sigma. So if you did have a reason to compute the QR of \Sigma, you could simply define Q = I and R = \Sigma and skip the computation.
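In ITensor terms, this trivial "QR" can be sketched with a `delta` (identity) tensor on a new link index (the index name and tag below are my own choices, not anything ITensor produces):

```julia
using ITensors

i = Index(4, "i")
j = Index(4, "j")
k = Index(4, "k")
A = randomITensor(i, j, k)
U, Σ, V = svd(A, i)

# "QR" of Σ by hand: Q is an identity (delta) tensor on a new link
# index, and R is Σ with its row index renamed to that link.
u, v = inds(Σ)
q = Index(dim(u), "Link,qr")   # hypothetical new link index
Q = delta(u, q)
R = replaceind(Σ, u, q)
# Contracting Q * R gives back Σ
```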

4. Finally, the reason for the error you are getting is that \Sigma, as returned by the SVD, internally uses “sparse” (“diagonal sparse”) storage rather than regular dense storage, and we haven’t defined the QR factorization for diagonal-sparse tensors yet. So as a workaround, if you do need to do this, you can first convert \Sigma to a dense tensor by calling

```julia
Σ = dense(Σ)
```

and then call qr on it; that should work.
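Putting the workaround together with the original minimal example:

```julia
using ITensors

i = Index(4, "i")
j = Index(4, "j")
k = Index(4, "k")
A = randomITensor(i, j, k)
U, Σ, V = svd(A, i)

Σd = dense(Σ)                        # convert diagonal-sparse storage to dense
Q, R = ITensors.qr(Σd, inds(Σd)[1])  # QR now dispatches to the dense path
```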


Hello!

I know this is a very late reply, but I want to thank you so much for the detailed response!
