Replies: 1 comment
I forgot about this discussion; it's pretty outdated and I think mostly implemented, so I'll close it. We can open a new discussion if needed.
To-do list for splitting off `NDTensors.BlockSparseArrays` as a separate registered package `BlockSparseArrays.jl`:

- Drop the `AnyAbstractBlockSparseArray` type union in favor of an `@derive` macro, similar to `Moshi.@derive`, the Rust derive attribute for implementing traits, and `ArrayLayouts.@layoutmatrix` and related macros in `ArrayLayouts.jl`. This would basically automatically define `getindex`, `map!`, etc. as `blocksparse_getindex`, `blocksparse_map!`, etc. on a specified type or wrapper.
- Go through the `NDTensors.jl` sub-modules that `BlockSparseArrays` depends on, and either remove those dependencies or assess what we need to do to split off those libraries into packages as well. For example:
  - `SparseArraysBase` is a major dependency, so we will have to release that first; see the `SparseArraysBase.jl` release to-do list in SparseArraysBase.jl#1.
  - `BroadcastMapConversion` converts broadcast calls to map calls (it is heavily inspired by the broadcasting logic in `Strided.jl`). That library is also used in other sub-modules of `NDTensors.jl`, such as `BlockSparseArrays` and `NamedDimsArrays`.
  - `GradedAxes` is being used for things like `dual` and to provide some generic block axis slicing functionality that works for both graded and non-graded unit ranges.
  - `TypeParameterAccessors` is used for generically accessing type parameters; in particular, `BlockSparseArrays` uses its functionality for generically getting the type of the parent of a wrapper type. We've been planning to split that off for a while, though I think there are still some type instability issues and interface questions to decide on, so I'm not sure how comfortable I am doing that right now.
  - `NestedPermutedDimsArrays` will be used as the output of `blocks(::PermutedDimsArray)`.
- Decide how to handle interoperability with `GradedAxes` and `TensorAlgebra` for compatibility with those libraries.
- Review the design of `BlockSparseArrays`, particularly in light of any changes we decide to make to `SparseArraysBase`, which are being discussed in the `SparseArraysBase.jl` release to-do list in SparseArraysBase.jl#1.
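For reference, the kind of forwarding code an `@derive`-style macro would generate can be sketched as follows. `MyBlockSparse`, `blocksparse_getindex`, and `@derive_getindex` are hypothetical stand-ins for illustration, not actual `BlockSparseArrays` internals:

```julia
# A minimal wrapper type standing in for a block sparse array type.
struct MyBlockSparse{T,A<:AbstractMatrix{T}} <: AbstractMatrix{T}
    parent::A
end
Base.size(a::MyBlockSparse) = size(a.parent)

# The block-sparse-specific implementation that generic functions forward to.
blocksparse_getindex(a::MyBlockSparse, I...) = a.parent[I...]

# A macro like `@derive` would emit forwarding definitions like this one
# automatically for a whole list of functions (`getindex`, `map!`, ...):
macro derive_getindex(T)
    quote
        Base.getindex(a::$(esc(T)), I::Int...) = blocksparse_getindex(a, I...)
    end
end

@derive_getindex MyBlockSparse

a = MyBlockSparse([1 2; 3 4])
a[2, 1]  # forwards to blocksparse_getindex, returns 3
```

A full `@derive` would take a type (or wrapper pattern) plus a list of functions and emit one such forwarding method per function, rather than relying on dispatching through a large type union.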
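The broadcast-to-map conversion idea can be sketched with Base machinery alone; `map_function_and_args` is a hypothetical helper in the spirit of `BroadcastMapConversion`, not its actual API:

```julia
using Base.Broadcast: Broadcasted, broadcasted, flatten

# Turn a (possibly nested) lazy Broadcasted object into a function plus
# argument tuple suitable for `map`, which is the natural entry point for
# sparse and block sparse arrays.
function map_function_and_args(bc::Broadcasted)
    fbc = flatten(bc)  # fuse nested broadcasts into a single function call
    return fbc.f, fbc.args
end

a = [1, 2, 3]
b = [10, 20, 30]
bc = broadcasted(+, broadcasted(*, a, a), b)  # lazy form of a .* a .+ b
f, args = map_function_and_args(bc)
map(f, args...)  # same result as copy(bc), i.e. [11, 24, 39]
```

This only covers the same-shape case; handling scalars and shape extension is where a dedicated library earns its keep.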
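As a sketch of the kind of functionality `TypeParameterAccessors` provides, getting the parent type of a wrapper at the type level looks like this. The `parenttype` name and these hand-written per-wrapper methods are illustrative; a generic type-parameter interface is precisely what avoids writing one method per wrapper:

```julia
using LinearAlgebra: Transpose

# Hand-written, per-wrapper extraction of the parent array type parameter.
parenttype(::Type{<:PermutedDimsArray{T,N,perm,iperm,P}}) where {T,N,perm,iperm,P} = P
parenttype(::Type{<:Transpose{T,P}}) where {T,P} = P

# Works purely at the type level, without needing an instance's parent.
PT = parenttype(typeof(PermutedDimsArray(zeros(2, 3), (2, 1))))  # Matrix{Float64}
```

Doing this generically (and type-stably) for arbitrary wrappers is the part that still has open interface questions.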
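The role of `NestedPermutedDimsArrays` can be illustrated without any package dependencies: permuting a blocked array permutes both the arrangement of the blocks and each block itself, so `blocks(::PermutedDimsArray)` is naturally an array of (lazily) permuted blocks. The `block` helper below is hypothetical, over a fixed 2×2 blocking of a 4×4 matrix:

```julia
a = reshape(1:16, 4, 4)
# Hypothetical helper: the (i, j) block of a 2x2-blocked matrix.
block(x, i, j) = view(x, 2i-1:2i, 2j-1:2j)

p = PermutedDimsArray(a, (2, 1))
# Block (1, 2) of the permuted array is block (2, 1) of the parent,
# with its own dimensions permuted -- hence "nested" PermutedDimsArrays:
block(p, 1, 2) == PermutedDimsArray(block(a, 2, 1), (2, 1))  # true
```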