Belief Propagation Example #6


Closed
wants to merge 4 commits

Conversation

JoeyT1994
Contributor

Request to merge some changes into the main branch.

Changes:

  1. New folder added in examples which contains two files:
    i) BeliefPropagationFunctions.jl - general-purpose functions for implementing belief propagation
    ii) ExampleBeliefPropagation.jl - code for running a small example of using belief propagation to calculate local expectation values for a random ITensorNetwork
  2. examples/peps/utils.jl: changed inner to dag one of the ITensorNetworks so that the correct contraction is calculated
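For readers skimming the PR, the files above implement the standard message-passing scheme of belief propagation. The sketch below is not the PR's code, just a minimal stdlib-only illustration of the idea on a chain, with plain Julia matrices standing in for ITensors; all names here are illustrative.

```julia
using LinearAlgebra

# Pairwise model on a chain of n sites: a random positive weight matrix
# on each edge, playing the role of a local tensor.
n = 4
W = [rand(2, 2) .+ 1.0 for _ in 1:(n - 1)]

# Forward and backward messages, initialised to uniform vectors.
fwd = [fill(1.0, 2) for _ in 1:n]
bwd = [fill(1.0, 2) for _ in 1:n]

# Sweep messages along the chain, normalising at each step
# (a 1-norm here, since all entries are positive).
for i in 2:n
  fwd[i] = normalize(W[i - 1]' * fwd[i - 1], 1)
end
for i in (n - 1):-1:1
  bwd[i] = normalize(W[i] * bwd[i + 1], 1)
end

# The local marginal at site i is the normalised elementwise product of
# the incoming messages; on a chain (a tree) this is exact.
marginal(i) = normalize(fwd[i] .* bwd[i], 1)
```

On a loopy network the same updates would be iterated to convergence rather than swept once, which is what makes belief propagation approximate there.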

@@ -116,15 +116,15 @@ function ITensors.apply(o⃗::Vector{ITensor}, ψ::ITensorNetwork; cutoff, maxdi
end

function flattened_inner_network(ϕ::ITensorNetwork, ψ::ITensorNetwork)
tn = inner(ϕ, ψ)
Member

I think inner should already take the proper dag (it should already dag the first tensor network ϕ).

Member

If the dag isn't being handled correctly by inner then it should be fixed there.
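For intuition on why the dag matters: in the simplest (vector) case, the inner product ⟨ϕ|ψ⟩ conjugates the bra before contracting, which is exactly what LinearAlgebra's dot does. A stdlib-only sketch (not ITensorNetworks code):

```julia
using LinearAlgebra

ϕ = [1.0 + 2.0im, 0.5 - 1.0im]
ψ = [0.0 + 1.0im, 2.0 + 0.0im]

# ⟨ϕ|ψ⟩ conjugates ("dags") the bra ϕ before contracting:
manual = sum(conj.(ϕ) .* ψ)

# dot applies that conjugation automatically, just as inner is expected
# to dag its first tensor network argument.
auto = dot(ϕ, ψ)

# Skipping the conjugation gives a different (wrong) contraction:
nodag = sum(ϕ .* ψ)
```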

function construct_initial_mts(psi::ITensorNetwork, g::NamedDimGraph, s::IndsNetwork)
#Make empty lists
forwardmts = ITensor[]
backwardmts = ITensor[]
Member

What about storing the message tensors in a Dictionary that maps the edges the message tensors are on to the message tensor? Then it is easy to look up the message tensors associated with a vertex by searching through the incident edges of the vertex.
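The suggested refactor can be pictured with a plain Julia Dict keyed by directed edges, with matrices standing in for the message ITensors; the names here are illustrative, not ITensorNetworks API.

```julia
# Map each directed edge (v, w) to the message tensor flowing from v to w.
mts = Dict{Tuple{Int,Int},Matrix{Float64}}()
edges = [(1, 2), (2, 3), (2, 4)]
for (v, w) in edges
  mts[(v, w)] = ones(2, 2)  # message v → w
  mts[(w, v)] = ones(2, 2)  # message w → v
end

# The messages flowing into a vertex are then a simple lookup over the
# incident edges, rather than a positional search through two lists.
incoming(v) = [mts[(w, u)] for (w, u) in keys(mts) if u == v]
```

This replaces the separate forwardmts/backwardmts vectors with a single structure where edge orientation is explicit in the key.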


#Normalise it
Mdat = matrix(M)
M = M / sqrt(tr(Mdat*Mdat'))
Member

You should be able to use normalize!(M) here.
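For reference, normalize divides by norm(M), the Frobenius norm, which equals sqrt(tr(M * M')) for a real matrix, so the two forms agree (normalize!(M) does the same in place). A stdlib-only check, assuming the ITensor behaves like a matrix under the Frobenius norm:

```julia
using LinearAlgebra

M = rand(3, 3)

# Manual normalisation as written in the example code:
manual = M / sqrt(tr(M * M'))

# The equivalent one-liner suggested in the review:
auto = normalize(M)
```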

@mtfishman
Member

Glad to see how simple it is! We could probably just move examples/BeliefPropagation/BeliefPropagationFunctions.jl into the source file, say as src/belief_propagation/belief_propagation.jl. That would make it easier to use in other places.


println("One Dimensional Chain")
g = NamedDimGraph(Graphs.SimpleGraphs.grid([n]), [i for i = 1:n])
s = siteinds("S=1/2", g)
Member

I guess we should allow passing a Graphs.SimpleGraph to siteinds if that isn't supported right now. Also we could make a default constructor NamedDimGraph(g::SimpleGraph) = NamedDimGraph(g, 1:nv(g)).

@JoeyT1994 JoeyT1994 closed this Sep 29, 2022