[ENHANCEMENT] Functionality to bring back or deprecate #5

Open · 28 tasks
mtfishman opened this issue Jan 16, 2025 · 0 comments
Labels: enhancement (New feature or request)

mtfishman commented Jan 16, 2025
This issue tracks functionality that has been removed, renamed, or changed in the rewrite and that should be brought back and/or properly deprecated (with deprecation warnings or errors).

Definitely bring back

  • ITensors.delta/ITensors.δ constructors for constructing diagonal ITensors with uniform diagonal values of 1. They should be overloads/wrappers around the same functions in DiagonalArrays.jl (see Delta tensors, DiagonalArrays.jl#6); a rough sketch of such a wrapper is included after this list.
  • NDTensors.BackendSelection.Algorithm/NDTensors.BackendSelection.@Algorithm_str should be removed from here and moved to a new package BackendSelection.jl.
  • NDTensors.denseblocks was defined in NDTensors.jl to convert a block sparse array to one with the same block sparse structure but dense blocks (say, if the blocks themselves were diagonal or sparse in some other way). I think this is useful; it should be defined in BlockSparseArrays.jl and based on a generic function dense in SparseArraysBase.jl that can generically convert a sparse array to a dense one while trying to preserve properties of the sparse array, such as whether it was originally allocated on GPU. A conceptual sketch follows this list.
  • ITensors.apply/ITensors.product for applying operator-like ITensors/tensor networks as linear maps. These should get reimplemented in terms of a more general index mapping system, since they rely heavily on prime levels and implicit conventions around those.
  • ITensors.splitblocks was used to split the blocks of a block sparse tensor in a certain dimension down to blocks of length 1, which is useful for making Hamiltonian tensor network operators more sparse. Define something similar in BlockSparseArrays.jl (maybe consider a different name but I can't think of a better one).
  • ITensors.factorize was used as a convenient abstraction layer on top of SVD, QR, eigen, etc. to select the best factorization based on the desired orthogonality and truncation; the old calling pattern is shown after this list for reference. Bring something like that back in MatrixAlgebra.jl/TensorAlgebra.jl and wrap it in NamedDimsArrays.jl. An initial stand-in version to start getting ITensorMPS.jl functionality working was started in Implement more missing functionality #15.
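
A rough sketch of what the delta/δ wrappers could look like. This assumes DiagonalArrays.jl gains a delta(eltype, dims) constructor as proposed in DiagonalArrays.jl#6, and the ITensor(data, inds...) constructor used here is also an assumption about the rewrite's API:

```julia
# Hypothetical sketch: wrap a DiagonalArrays.jl "delta" array in an ITensor.
# `DiagonalArrays.delta` is assumed from the proposal in DiagonalArrays.jl#6,
# and `ITensor(data, inds...)` is an assumption about the rewrite's constructors.
using DiagonalArrays: DiagonalArrays
using ITensors: ITensor, Index

function delta(elt::Type{<:Number}, is::Index...)
  # Build a diagonal array with uniform diagonal value one(elt), then
  # attach the named indices by wrapping it in an ITensor.
  return ITensor(DiagonalArrays.delta(elt, length.(is)), is...)
end
delta(is::Index...) = delta(Float64, is...)
const δ = delta
```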
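
For denseblocks, a minimal conceptual sketch of the idea, written against BlockArrays.jl for illustration. The dense function here is a stand-in for the generic function proposed for SparseArraysBase.jl, and a real implementation in BlockSparseArrays.jl would only touch stored blocks and preserve the block sparse structure and storage properties (e.g. GPU allocation):

```julia
# Conceptual sketch only; not the BlockSparseArrays.jl implementation.
using BlockArrays: blocks, mortar
using LinearAlgebra: Diagonal

# Stand-in for the proposed generic `SparseArraysBase.dense`; a real version
# would preserve properties of the input, like keeping GPU data on the GPU.
dense(x::AbstractArray) = Array(x)

# Densify every block while keeping the overall blocked structure.
denseblocks(a) = mortar(map(dense, blocks(a)))

# Example: a blocked matrix of Diagonal blocks becomes a blocked matrix of
# dense Matrix blocks with the same 2×2 block structure.
a = mortar(reshape([Diagonal(ones(2)) for _ in 1:4], 2, 2))
b = denseblocks(a)
```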
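
And for reference on factorize, the pre-rewrite ITensors.jl calling pattern that ITensorMPS.jl relies on looked roughly like this (names and keywords as in recent pre-rewrite ITensors.jl releases, from memory; the new interface is still open):

```julia
using ITensors: Index, random_itensor, factorize

i, j, k = Index(2), Index(3), Index(4)
A = random_itensor(i, j, k)

# Factor A ≈ L * R with L orthogonal (isometric) with respect to (i, j).
# The decomposition (QR, SVD, eigen, ...) is chosen internally based on the
# orthogonality and truncation keyword arguments.
L, R = factorize(A, i, j; ortho="left")
```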

Move to a separate package

Bring back with a new name and deprecate the old name

Maybe bring back

  • ITensors.Index(length::Int, tags::String) has been removed for now: tags were changed to a dictionary structure in Store tags in a dictionary #20, so it isn't clear what the string parsing should do. We could bring back tag sets for backwards compatibility, but I wanted to assess the new design first and keep the constructors minimal.
  • ITensors.hasind(a, i)/ITensors.hasinds(a, is). I'm in favor of replacing these with i ∈ inds(a) and is ⊆ inds(a); we can remove them for now and bring them back as deprecations (see the sketch after this list).
  • ITensors.itensor was a constructor that tried to make a view of the input data, while ITensor was a copying constructor. For now I removed that distinction and am just using ITensor. We can assess whether itensor is needed and either bring it back or deprecate it.
  • ITensors.outer/NDTensors.outer was functionality for getting outer products of tensors. Either bring it back in TensorAlgebra.jl in some form, or get rid of it and just use TensorAlgebra.contract (see [ENHANCEMENT] API for tensor/outer products TensorAlgebra.jl#17).
  • ITensors.contract[!]. Currently we just have *, and contract is defined in TensorAlgebra.jl as TensorAlgebra.contract. Consider just using *, with LinearAlgebra.mul!/ArrayLayouts.muladd! for in-place contraction of named dimension arrays (a short illustration follows this list). Also, decide what to do about non-binary/network contractions.
  • NDTensors.enable_auto_fermion, NDTensors.using_auto_fermion, etc. need to be brought back in some form once we start reimplementing the automatic fermion sign system in the new code.
  • ITensors.convert_leaf_eltype to convert the scalar type of nested data structures. Find a better name or just use Adapt.adapt for that functionality.
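
A minimal sketch of what bringing hasind/hasinds back as deprecations could look like, using Base's @deprecate macro and assuming inds keeps its current name:

```julia
# Hypothetical deprecation sketch; the trailing `false` avoids exporting the
# old names.
using ITensors: inds

@deprecate hasind(a, i) (i ∈ inds(a)) false
@deprecate hasinds(a, is) (is ⊆ inds(a)) false
```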
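
To illustrate the current state described in the contract[!] item above (whether LinearAlgebra.mul!/ArrayLayouts.muladd! becomes the in-place form is the open question; random_itensor is assumed to carry over to the rewrite):

```julia
using ITensors: Index, random_itensor

i, j, k = Index(2), Index(3), Index(4)
a = random_itensor(i, j)
b = random_itensor(j, k)

# Binary contraction over the shared index `j` via `*`; the result has
# indices (i, k).
c = a * b
```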

Deprecate and delete

  • ITensors.TagSets.TagSet/ITensors.TagSets.@ts_str were used to parse string inputs like "a,b,c" into sets of tags {"a", "b", "c"}, either at runtime or at compile time. With the new named tag/tag dict design introduced in Store tags in a dictionary #20, it isn't clear whether we want or need that, or what the syntax would be. For now we should remove uses of it and reassess bringing it back as needed.
  • ITensors.OneITensor should be deleted in favor of a scalar ITensor type. I think it is just internal, so it doesn't need to be deprecated, but we'll need to see where it is being used and change those parts of the code.
  • ITensors.IndexSet has been deprecated for a while but is still around for backwards compatibility; we should delete it. It could be replaced by an internal type alias const IndexCollection = Union{Vector{<:Index},Tuple{Vararg{Index}}}.
  • ITensors.Apply is a lazy version of ITensors.apply/ITensors.product for applying operator-like ITensors/tensor networks as linear maps. Decide if we still want/need that, and if not, delete or deprecate it.
  • ITensors.order/NDTensors.order was a different name for Base.ndims; I think we should deprecate it and just use Base.ndims (same for the compile-time version ITensors.Order, which could become NDims if needed). See the deprecation sketch after this list.
  • ITensors.permute(a, (j, i)) for permuting the ITensor data to a certain ordering. Deprecate in favor of NamedDimsArrays.aligndims.
  • ITensors.dim was the name for getting the length of an Index or ITensor; we should deprecate it and just use Base.length and Base.to_dim instead.
  • ITensors.complex! was an in-place version of Base.complex for ITensors. I think it can be deprecated/removed.
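
A sketch of the straightforward renaming deprecations from this list, assuming NamedDimsArrays.aligndims is the replacement for permute:

```julia
# Hypothetical deprecation sketch for the renames listed above.
using NamedDimsArrays: aligndims

@deprecate order(a) ndims(a)
@deprecate permute(a, inds) aligndims(a, inds)
@deprecate dim(i) length(i)
```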