This is an issue for tracking functionality that has been removed, renamed, or changed in the rewrite, which should be brought back and/or properly deprecated (with deprecation warnings or errors).
Definitely bring back
ITensors.delta/ITensors.δ constructors for diagonal ITensors with uniform diagonal values of 1. They should be overloads/wrappers around the same functions in DiagonalArrays.jl, see Delta tensors DiagonalArrays.jl#6.
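A minimal sketch of what such a wrapper could look like, using LinearAlgebra.Diagonal as a stand-in for a DiagonalArrays.jl type (the names and signatures here are assumptions, not the DiagonalArrays.jl API):

```julia
using LinearAlgebra: Diagonal

# Sketch: a square diagonal matrix with uniform diagonal value one. A real
# implementation would dispatch to a DiagonalArrays.jl constructor and
# support higher-order delta tensors with named indices.
delta(elt::Type{<:Number}, n::Integer) = Diagonal(ones(elt, n))
delta(n::Integer) = delta(Float64, n)
const δ = delta

δ(3)  # 3×3 diagonal matrix of ones
```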
NDTensors.BackendSelection.Algorithm/NDTensors.BackendSelection.@Algorithm_str should be removed from here and moved to a new package BackendSelection.jl.
NDTensors.denseblocks was defined in NDTensors.jl to convert a block sparse array to the same block sparse structure but with dense blocks (say, if the blocks themselves were diagonal or sparse in some other way). I think this is useful; it should be defined in BlockSparseArrays.jl and based on a generic function dense in SparseArraysBase.jl that can generically convert a sparse array to a dense one while trying to preserve properties of the sparse array, like whether it was originally allocated on GPU.
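To illustrate the idea with a toy block storage (a hedged sketch; this is not the BlockSparseArrays.jl representation):

```julia
using LinearAlgebra: Diagonal

# Toy block sparse storage: stored blocks keyed by their block position.
struct ToyBlockSparse{A<:AbstractMatrix}
    blocks::Dict{Tuple{Int,Int},A}
end

# Keep the block sparsity pattern, but materialize each stored block densely.
denseblocks(a::ToyBlockSparse) =
    ToyBlockSparse(Dict(b => Array(block) for (b, block) in a.blocks))

a = ToyBlockSparse(Dict((1, 1) => Diagonal(ones(2))))
denseblocks(a)  # same stored blocks, but block (1, 1) is now a dense Matrix
```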
ITensors.apply/ITensors.product for applying operator-like ITensors/tensor networks as linear maps. These should get reimplemented in terms of a more general index mapping system, since they rely heavily on prime levels and implicit conventions around those.
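For reference, the old prime-level convention looked roughly like this (using the pre-rewrite ITensors.jl API; `apply_sketch` is an illustrative simplification of what `apply` did for a single-site operator):

```julia
using ITensors  # the pre-rewrite ITensors.jl, shown for reference

i = Index(2)
A = ITensor(randn(2, 2), i', i)  # operator with index pair (i', i)
x = ITensor(randn(2), i)         # state with index i

# Contracting A * x sums over i and leaves i'; `noprime` maps i' back to i.
apply_sketch(A, x) = noprime(A * x)
apply_sketch(A, x)
```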
ITensors.splitblocks was used to split the blocks of a block sparse tensor in a certain dimension down to blocks of length 1, which is useful for making Hamiltonian tensor network operators more sparse. Define something similar in BlockSparseArrays.jl (maybe consider a different name but I can't think of a better one).
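In BlockArrays.jl terms, splitting one dimension down to length-1 blocks amounts to replacing its blocked axis, something like the following (illustrative sketch; `splitblocks_axis` is not an existing API):

```julia
using BlockArrays: blockedrange, blocklengths

# Same axis length, but every block has length 1.
splitblocks_axis(ax) = blockedrange(ones(Int, length(ax)))

ax = blockedrange([2, 3])           # blocked axis with block lengths [2, 3]
blocklengths(splitblocks_axis(ax))  # [1, 1, 1, 1, 1]
```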
ITensors.factorize was used as a convenient abstraction layer on top of SVD, QR, eigen, etc. to select the best factorization based on the desired orthogonality and truncation. Bring something like that back in MatrixAlgebra.jl/TensorAlgebra.jl and wrap it in NamedDimsArrays.jl. An initial stand-in version to start getting ITensorMPS.jl functionality working was started in Implement more missing functionality #15.
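A minimal matrix-level stand-in for the idea (the name and keywords are assumptions; the eventual MatrixAlgebra.jl/TensorAlgebra.jl interface may differ):

```julia
using LinearAlgebra

# Pick the factorization from the requested orthogonality and truncation:
# QR/LQ when no truncation is needed, truncated SVD otherwise.
function factorize_matrix(A::AbstractMatrix; ortho::Symbol=:left, maxrank=nothing)
    if maxrank === nothing
        if ortho === :left
            F = qr(A)
            return Matrix(F.Q), Matrix(F.R)
        else
            F = lq(A)
            return Matrix(F.L), Matrix(F.Q)
        end
    end
    U, S, V = svd(A)
    r = min(maxrank, length(S))
    return ortho === :left ? (U[:, 1:r], Diagonal(S[1:r]) * V[:, 1:r]') :
                             (U[:, 1:r] * Diagonal(S[1:r]), V[:, 1:r]')
end
```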
Move to a separate package
ITensors.ContractionSequenceOptimization has an optimal contraction sequence finder, which should be split off into a more general package.
ITensors.@debug_check should be defined in a separate package, such as DebugChecks.jl (see also https://github.com/Ferrite-FEM/Ferrite.jl/blob/v1.0.0/src/utils.jl and https://docs.julialang.org/en/v1/stdlib/Logging). For now its usage is being removed and we can bring back debug checks later.
NDTensors.random_unitary: bring back in MatrixAlgebra.jl/TensorAlgebra.jl, etc. There is a simplistic version in QuantumOperatorDefinitions.jl but the implementation should be moved out to a library where it can be shared.
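For example, the standard construction of a Haar-distributed unitary via QR with phase fixing, as a sketch of what the shared implementation could look like:

```julia
using LinearAlgebra

# QR of a Gaussian random matrix; absorbing the phases of diag(R) into Q makes
# the result Haar-distributed (Mezzadri, arXiv:math-ph/0609050).
function random_unitary(elt::Type{<:Number}, n::Int)
    F = qr(randn(elt, n, n))
    return Matrix(F.Q) * Diagonal(sign.(diag(F.R)))
end

random_unitary(ComplexF64, 4)  # 4×4 unitary; a real `elt` gives an orthogonal matrix
```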
Bring back with a new name and deprecate the old name
ITensors.directsum needs to be brought back. We may only need to define Base.cat for NamedDimsArrays.AbstractNamedDimsArray where you can specify named dimensions to direct sum, but the interface is a bit different so directsum could be a more convenient interface for Base.cat. For now I'll start by keeping things minimal and just define a named dimension version of Base.cat, and see if we need directsum, and if not we can deprecate it.
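For plain matrices the building block already exists: a direct sum is Base.cat along both dimensions.

```julia
A, B = ones(2, 2), ones(3, 3)
C = cat(A, B; dims=(1, 2))  # 5×5 block diagonal: A in the upper left, B in the lower right
```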
ITensors.diag_itensor should be brought back, but it should be renamed to diagonalarray, i.e. diagonalarray(diag, i, j, k), which should be an overload of the function with the same name that we will define in DiagonalArrays.jl, see [ENHANCEMENT] Define a generic diagonalarray constructor DiagonalArrays.jl#8.
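A dense stand-in for the proposed constructor (the DiagonalArrays.jl version from issue #8 would presumably return a structured diagonal type rather than a dense array):

```julia
# Place diag[n] at position (n, n, ..., n); all off-diagonal entries are zero.
function diagonalarray(diag::AbstractVector, dims::Int...)
    a = zeros(eltype(diag), dims...)
    for n in 1:min(length(diag), minimum(dims))
        a[ntuple(_ -> n, length(dims))...] = diag[n]
    end
    return a
end

diagonalarray([1.0, 2.0], 2, 2, 2)  # 2×2×2 array with a[1,1,1] = 1.0, a[2,2,2] = 2.0
```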
ITensors.onehot should get deprecated and renamed to SparseArraysBase.oneelement, see [ENHANCEMENT] Define SparseArraysBase.OneElement/SparseArraysBase.oneelement SparseArraysBase.jl#24. Started implementing in Add missing functionality like tagging and priming #4, Change onehot to oneelement #26.
NDTensors.enable_threaded_blocksparse/NDTensors.disable_threaded_blocksparse (also callable as ITensors.f). Probably this will be turned into a contraction backend of BlockSparseArrays.jl, and will have a different interface.
Maybe bring back
ITensors.Index(length::Int, tags::String) has been removed for now, since tags were changed to a dictionary structure in Store tags in a dictionary #20 so it isn't clear what the string parsing should do. We could bring back tag sets for backwards compatibility but I wanted to assess the new design first and keep the constructors minimal.
ITensors.hasind(a, i)/ITensors.hasinds(a, is). I'm in favor of deprecating these in favor of i ∈ inds(a) and is ⊆ inds(a); we can remove them for now and bring them back as deprecations.
ITensors.itensor was a constructor that tried to make a view of the input data, while ITensor was a copying constructor. For now I removed that distinction and am just using ITensor. We can assess if itensor is needed, and either bring it back or deprecate it.
ITensors.outer/NDTensors.outer was functionality for getting outer products of tensors. Either bring it back in TensorAlgebra.jl in some form, or get rid of it and just use TensorAlgebra.contract (see [ENHANCEMENT] API for tensor/outer products TensorAlgebra.jl#17).
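For comparison, an outer product of plain arrays is a broadcast over reshaped inputs, which is also expressible as a contraction with no contracted indices (schematic; not the TensorAlgebra.jl API):

```julia
# The result has size (size(a)..., size(b)...).
outer(a::AbstractArray, b::AbstractArray) =
    reshape(a, size(a)..., ntuple(_ -> 1, ndims(b))...) .*
    reshape(b, ntuple(_ -> 1, ndims(a))..., size(b)...)

outer([1, 2], [10, 20, 30])  # 2×3 matrix with entries a[i] * b[j]
```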
ITensors.contract[!]. Currently we just have *, and contract is defined in TensorAlgebra.jl as TensorAlgebra.contract. Consider just using *, and LinearAlgebra.mul!/ArrayLayouts.muladd! for in-place contraction of named dimension arrays. Also, decide what to do about non-binary/network contractions.
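The matrix version of that allocating/in-place split, for reference:

```julia
using LinearAlgebra

A, B = rand(2, 3), rand(3, 4)
C = A * B      # allocating contraction
mul!(C, A, B)  # in-place: overwrites the preallocated C with A * B
```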
NDTensors.enable_auto_fermion, NDTensors.using_auto_fermion, etc. need to be brought back in some form once we start reimplementing the automatic fermion sign system in the new code.
ITensors.convert_leaf_eltype to convert the scalar type of nested data structures. Find a better name or just use Adapt.adapt for that functionality.
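Adapt.adapt already recurses through the containers and wrappers it supports (tuples, named tuples, many array wrappers), so something like the following should cover common nested cases:

```julia
using Adapt

x = (rand(Float64, 2), (rand(Float64, 3),))
y = adapt(Array{Float32}, x)  # same nesting, with leaf arrays converted to Float32
```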
Deprecate and delete
ITensors.TagSets.TagSet/ITensors.TagSets.@ts_str were used to parse string inputs like "a,b,c" into sets of tags {"a", "b", "c"}, either at runtime or at compile time. With the new named tag/tag dict design introduced in Store tags in a dictionary #20 it isn't clear whether we want or need that, or what the syntax would be. For now we should remove uses of that and reassess bringing those back as needed.
ITensors.OneITensor should be deleted in favor of a scalar ITensor type. I think it is just internal so doesn't need to be deprecated, but we'll need to see where it is being used and change those parts of the code.
ITensors.IndexSet has been deprecated for a while but is still around for backwards compatibility; we should delete it. It could be replaced by an internal type alias const IndexCollection = Union{Vector{<:Index},Tuple{Vararg{Index}}}.
ITensors.Apply is a lazy version of ITensors.apply/ITensors.product for applying operator-like ITensors/tensor networks as linear maps. Decide if we still want/need that and if not delete it or deprecate it.
ITensors.order/NDTensors.order was a different name for Base.ndims; I think we should deprecate it and just use Base.ndims (same for the compile time version ITensors.Order, which could be NDims if needed).
ITensors.permute(a, (j, i)) for permuting the ITensor data to a certain ordering. Deprecate in favor of NamedDimsArrays.aligndims.
ITensors.dim was the name for getting the length of Index and ITensor; we should deprecate it and just use Base.length and Base.to_dim instead.
ITensors.complex! was an in-place version of Base.complex for ITensors. I think it can be deprecated/removed.