
Additions needed for sweeping algorithms on trees #18

Merged
merged 20 commits on Jan 9, 2023

Conversation

@leburgel (Contributor) commented Nov 4, 2022

Work in progress. This adds some functionality for tree tensor networks that is needed to run sweeping algorithms on them. It's quite a lot, so in addition to the raw code review I'll try to summarize the main additions below, as well as start a list of things I thought might be good to discuss while coding (some of which are not very relevant and should just be filed as issues...).

Summary of additions (a rough usage sketch follows the list):

  • ITensorNetwork:
    • Type conversion
    • ITensors.replaceinds:
      • modified to check for equality of underlying graphs, and to take into account unassigned vertices or edges
    • Priming and tagging:
      • added linkinds and siteinds filters similar to the ITensors.MPS case
    • Truncation
    • linkdims
    • Common site index checking
    • Site index combiners
  • AbstractTreeTensorNetwork
    • Some additional constructors
      • Constructors starting from dense tensors
      • Random and product state constructors
    • Orthogonalization, contraction and truncation
    • Norm, lognorm and normalization
    • Some scalar arithmetic
    • Addition (direct sum algorithm)
  • TreeTensorNetworkOperator (new type)
    • Automatic generation from ITensors.OpSum
    • Expectation value and variance (inner)
  • ProjTTNO, ProjTTNOApply and ProjTTNOSum (new types)
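To make the list above more concrete, here is a rough usage sketch. Everything in it — the tree constructor, the `siteinds`-on-a-graph call, the `randomTTNS` constructor, the exact form of the `OpSum`-to-`TTNO` conversion, and the `inner`-based expectation value and variance — is an assumption modeled on the items listed above and on the analogous ITensors.MPS interface, not code copied from this PR.

```julia
# Hypothetical usage sketch; all names and signatures here are assumptions and
# may differ from what this PR actually exports.
using ITensors
using ITensorNetworks
using NamedGraphs

# A small tree to put the network on (assumed NamedGraphs helper).
g = named_comb_tree((2, 3))

# Site indices attached to every vertex of the tree (assumed signature).
s = siteinds("S=1/2", g)

# Random-state constructor from the list above (assumed name and keyword).
psi = randomTTNS(s; link_space=4)

# Automatic TTNO generation from an OpSum (vertex labeling is an assumption).
os = OpSum()
os += "Sz", (1, 1), "Sz", (1, 2)
H = TTNO(os, s)

# Expectation value and variance via `inner`.
E  = inner(psi', H, psi) / inner(psi, psi)
E2 = inner(H, psi, H, psi) / inner(psi, psi)
```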

leburgel and others added 7 commits October 31, 2022 11:38
Remove tree graph paths (merged into NamedGraphs.jl)
Extend `models.jl` and remove `test/utils.jl`

Change lazy `inner` to explicit `inner_network`
Add support for choosing a subset of sites in `expect`, similar to MPS implementation

Condition expectation value type on `leaf_eltype` of state
@leburgel leburgel marked this pull request as draft November 4, 2022 22:03
@emstoudenmire (Contributor):

@mtfishman @leburgel : More of a minor comment, but should the TreeTensorNetworkOperator and TreeTensorNetworkState types just be named TTNO and TTNS ? I see that these are provided as aliases for the above, but then I think most people, including developers, would prefer to always use the abbreviations. Let me know if there's a more technical reason for needing the longer names.

@mtfishman (Member):

> @mtfishman @leburgel : More of a minor comment, but should the TreeTensorNetworkOperator and TreeTensorNetworkState types just be named TTNO and TTNS ? I see that these are provided as aliases for the above, but then I think most people, including developers, would prefer to always use the abbreviations. Let me know if there's a more technical reason for needing the longer names.

That choice is mostly due to my bias towards being more explicit and avoiding abbreviations as much as possible. I don't see the disadvantage of providing the aliases which people can use if they want (and we can use internally if we decide that style is preferred).

@leburgel (Contributor, Author):

> @mtfishman @leburgel : More of a minor comment, but should the TreeTensorNetworkOperator and TreeTensorNetworkState types just be named TTNO and TTNS ? I see that these are provided as aliases for the above, but then I think most people, including developers, would prefer to always use the abbreviations. Let me know if there's a more technical reason for needing the longer names.

> That choice is mostly due to my bias towards being more explicit and avoiding abbreviations as much as possible. I don't see the disadvantage of providing the aliases which people can use if they want (and we can use internally if we decide that style is preferred).

I stuck to explicit type names and short aliases since the choice had already been made for TreeTensorNetworkState, but I haven't really paid attention to using either consistently after that. I think I mostly used the shorthands later on in the code, but this was not really purposeful and can be changed if desirable.
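For reference, the alias pattern being discussed amounts to a pair of `const` bindings along these lines (the names come from this thread; the exact definitions in the package may differ):

```julia
const TTNS = TreeTensorNetworkState
const TTNO = TreeTensorNetworkOperator
```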

mutable struct ProjTTNO <: AbstractProjTTNO
pos::Union{Vector{<:Tuple},NamedDimEdge{Tuple}} # TODO: cleanest way to specify effective Hamiltonian position?
H::TTNO
environments::Dictionary{NamedDimEdge{Tuple},ITensor}

Review comment (Member):
It may be nice to make this a directed DataGraph.

@leburgel (Contributor, Author) replied Dec 21, 2022:

Could be a good idea; adding/invalidating environments by adding/removing directed edges sounds nice. I would have to think about what to do with the edges inside the operator position for a 2-site effective operator (remove them altogether?), and whether I could get rid of some other things by using more DataGraph-like methods instead.
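As a toy illustration of the add/invalidate-by-directed-edge idea (this sketch uses plain Graphs.jl and a `Dict` rather than the proposed `DataGraph`, and none of the names below come from the PR):

```julia
using Graphs: SimpleGraph, a_star, src, dst

# Toy environment cache keyed by directed edges (src => dst). When the
# projected operator position moves from `old_pos` to `new_pos`, every cached
# environment pointing back toward `old_pos` along that path becomes stale
# and is dropped so it will be recomputed.
function invalidate_environments!(
    envs::Dict{Pair{Int,Int},T}, tree::SimpleGraph, old_pos::Int, new_pos::Int
) where {T}
    for e in a_star(tree, old_pos, new_pos)
        delete!(envs, dst(e) => src(e))
    end
    return envs
end
```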

"""
ProjTTNOSum
"""
mutable struct ProjTTNOSum

Review comment (Member):

Again, I think this should not be mutable.

I was also thinking about using the Applied system in ITensors.LazyApply to avoid having to actually define types like this. That would be a more general way to represent lazy combinations of projected TTNs (i.e. help with representing more general combinations, like sums of projected TTNS and TTNO, sums of TTNOs applied to TTNS, etc. without having to hardcode specific types).

Maybe we could chat about how that might work; this would be a good opportunity to rethink some of the type designs here (some of these designs are holdovers from porting the C++ version).
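As a toy illustration of that direction, here is a simplified hand-rolled stand-in (the `Applied` name is taken from the thread, but this definition and the `materialize` helper are made up for the example and are not ITensors.LazyApply's actual implementation):

```julia
# A minimal lazy "applied function" wrapper: it records a function and its
# arguments instead of hardcoding types like ProjTTNOSum or ProjTTNOApply.
struct Applied{F,Args<:Tuple}
    f::F
    args::Args
end

# A lazy sum of projected operators is Applied(sum, (P1, P2, ...)); acting on
# a state applies each summand and adds the results.
(A::Applied{typeof(sum)})(v) = sum(P -> P(v), A.args)

# A lazy application of an operator to a state is Applied(*, (P, psi)),
# evaluated only when explicitly requested.
materialize(A::Applied{typeof(*)}) = *(A.args...)
```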

@leburgel (Contributor, Author) replied:

This definitely sounds like a nice idea.

Merge updates for new `NamedGraphs.jl`/`DataGraphs.jl` interface

Remove unnecessary `ITensors.convert_leaf_eltype` calls in favor of `ITensors.convert_eltype`

Add `@test_broken` for `swapprime(::AbstractITensorNetwork, ...)`

Use `Graphs.degree` in `OpSum` -> `TTNO` converter

Remove `siteinds(all/only, ...)` and `linkinds(all/only, ...)` syntax

@mtfishman (Member):

@leburgel you bring up a lot of good questions and points in your first post but it is tough to try to discuss and resolve them from this PR. Could you split up questions that haven't been resolved yet into separate issues?

@leburgel (Contributor, Author):

> @leburgel you bring up a lot of good questions and points in your first post but it is tough to try to discuss and resolve them from this PR. Could you split up questions that haven't been resolved yet into separate issues?

Yes, I'll drop the things that are no longer relevant and move the rest to new issues.

return collect(Base.Iterators.flatten(edges))
end

function internal_edges(P::AbstractProjTTNO{V})::Vector{NamedEdge{V}} where {V}

Review comment (Member):

This is a good idea for a generic graph function. It is the complement of the edge boundary of a subgraph (see https://en.wikipedia.org/wiki/Boundary_(graph_theory) and `boundary_edges` added in ITensor/NamedGraphs.jl#20).

As we discussed, a simple implementation would be `internal_edges(g::AbstractGraph, vertices) = edges(subgraph(g, vertices))`. Then this function could be defined as `internal_edges(P::AbstractProjTTNO) = internal_edges(underlying_graph(P), sites(P))`.
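A quick sketch of the suggested generic version, written against plain Graphs.jl for illustration (using `induced_subgraph` in place of the `subgraph` mentioned above; note that Graphs.jl relabels vertices in the induced subgraph, whereas a NamedGraphs-based `subgraph` would keep the original vertex names):

```julia
using Graphs

# Edges with both endpoints inside `vertices`: the complement of the
# edge boundary of the induced subgraph.
internal_edges(g::AbstractGraph, vertices) = edges(first(induced_subgraph(g, vertices)))

# Example: on the path graph 1-2-3-4-5, the vertex set {2, 3, 4} has two
# internal edges (returned with the subgraph's relabeled vertices).
g = path_graph(5)
collect(internal_edges(g, [2, 3, 4]))
```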

@leburgel leburgel marked this pull request as ready for review January 9, 2023 18:41

@codecov-commenter commented Jan 9, 2023

Codecov Report

Merging #18 (e9f0c7a) into main (7c98d32) will decrease coverage by 14.81%.
The diff coverage is 47.11%.

@@             Coverage Diff             @@
##             main      #18       +/-   ##
===========================================
- Coverage   80.17%   65.36%   -14.82%     
===========================================
  Files          29       37        +8     
  Lines        1226     2105      +879     
===========================================
+ Hits          983     1376      +393     
- Misses        243      729      +486     
| Impacted Files | Coverage Δ |
| --- | --- |
| src/ITensorNetworks.jl | 77.77% <ø> (ø) |
| src/treetensornetwork/abstractprojttno.jl | 0.00% <0.00%> (ø) |
| src/treetensornetwork/projttno.jl | 0.00% <0.00%> (ø) |
| src/treetensornetwork/projttno_apply.jl | 0.00% <0.00%> (ø) |
| src/treetensornetwork/projttnosum.jl | 0.00% <0.00%> (ø) |
| src/treetensornetwork/ttno.jl | 18.57% <18.57%> (ø) |
| src/abstractitensornetwork.jl | 76.48% <23.17%> (-16.22%) ⬇️ |
| src/treetensornetwork/ttns.jl | 30.61% <30.61%> (ø) |
| src/itensornetwork.jl | 85.34% <33.33%> (-2.38%) ⬇️ |
| src/treetensornetwork/abstracttreetensornetwork.jl | 36.06% <36.06%> (ø) |
| ... and 9 more | |


@mtfishman mtfishman merged commit 0d64da8 into ITensor:main Jan 9, 2023