
Try these ideas #30

@trevphil

Description

  • Use sampling (GTSAM)
  • Use tree-reweighted max product
  • With LBP, it might converge to the correct marginals if you "observe" N internal bits of SHA-256. Of course, these bits aren't really observed, but if N is small you can brute-force all 2^N assignments and keep the ones where LBP converges (see the sketches after this list) [LBP probably won't work]
  • Use variational inference (GTSAM's DiscreteBayesNet does not do this); PyMC3 could work (sketch below)
  • Optimization techniques for high-dimensional variables: read here and here
  • Particle swarm optimization (sketch below) [too slow, didn't work]
  • "Observe" as much as you can from hash outputs, e.g. if the parent is SAME, INV, or AND and the result is 1 [done]
  • Reformulate the AND gate as a set of switchable constraints in which only a subset should be active, then apply robust estimation techniques like DCS and GNC (sketch below)
  • Try continuous random variables for 32-bit BitVectors and formulate continuous-domain equivalents of AND, INV, XOR, SHIFT, OR, ADD (sketch below)
  • Represent the hash function as a neural network (preserving the AND and NOT operations), with skip connections so that nodes are aware of decisions upstream (sketch below). Look into references at the bottom of this article.
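
Sketch for the brute-force LBP idea: iterate over all 2^N assignments of the N "observed" internal bits, clamp them as evidence, and keep the assignments where LBP converges. `run_lbp` is a hypothetical stand-in for whatever LBP implementation the factor graph uses; it is assumed to return (marginals, converged).

```python
import itertools

def brute_force_lbp(factor_graph, internal_bit_ids, run_lbp):
    """Try every assignment of the N chosen internal bits as hard evidence.

    `run_lbp(factor_graph, evidence)` is a hypothetical helper assumed to
    return (marginals, converged). Only feasible for small N (2^N runs).
    """
    candidates = []
    for bits in itertools.product([0, 1], repeat=len(internal_bit_ids)):
        evidence = dict(zip(internal_bit_ids, bits))
        marginals, converged = run_lbp(factor_graph, evidence)
        if converged:
            candidates.append((evidence, marginals))
    return candidates
```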
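Sketch of the PyMC3 / variational-inference idea, assuming a continuous relaxation of the bits (ADVI only handles continuous variables), a single AND gate whose output was observed as 1, and a soft constraint via a Potential. The variable names and penalty weight are illustrative, not the project's model.

```python
import numpy as np
import pymc3 as pm

with pm.Model():
    # Relax each bit to a (0, 1)-valued continuous quantity via a sigmoid.
    logit_a = pm.Normal("logit_a", 0.0, 1.0)
    logit_b = pm.Normal("logit_b", 0.0, 1.0)
    a = pm.math.sigmoid(logit_a)
    b = pm.math.sigmoid(logit_b)

    # Soft constraint: the AND of the two bits (relaxed as a * b) was observed to be 1.
    pm.Potential("and_is_one", -50.0 * (a * b - 1.0) ** 2)

    # Mean-field ADVI (hence the continuous relaxation above).
    approx = pm.fit(n=20000, method="advi")
    trace = approx.sample(1000)

# Both relaxed bits should end up close to 1.
print(1.0 / (1.0 + np.exp(-trace["logit_a"].mean())),
      1.0 / (1.0 + np.exp(-trace["logit_b"].mean())))
```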
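Minimal PSO sketch in plain NumPy for the particle-swarm idea: particles live in [0, 1]^D (one dimension per input bit), and the objective would be something like the Hamming distance between the hash of the rounded particle and the target digest. `hash_bits` and `target_bits` in the commented example are hypothetical; the rest is standard global-best PSO.

```python
import numpy as np

def pso(objective, dim, n_particles=64, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Vanilla global-best PSO over [0, 1]^dim (bits relaxed to reals)."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, dim))          # positions
    v = np.zeros_like(x)                        # velocities
    pbest, pbest_cost = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        cost = np.array([objective(p) for p in x])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Hypothetical objective: Hamming distance between hash(rounded bits) and the target.
# def objective(p): return np.sum(hash_bits(np.round(p)) != target_bits)
```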
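Sketch of the "observe what you can" idea: walk backward from known output bits and fix inputs whenever the gate and the observed value determine them (SAME and INV always do; AND does when the output is 1). The gate names mirror the bullet; the graph representation (a dict of (op, inputs) per bit) is an assumption.

```python
def propagate_observed(known, gates):
    """Fix input bits implied by already-known output bits.

    `known` maps bit id -> 0/1. `gates` maps bit id -> (op, [input ids]),
    where op is "SAME", "INV", or "AND" (assumed representation).
    Repeats until no new bit can be deduced.
    """
    changed = True
    while changed:
        changed = False
        for out, (op, inputs) in gates.items():
            if out not in known:
                continue
            val = known[out]
            if op == "SAME":
                new = {inputs[0]: val}
            elif op == "INV":
                new = {inputs[0]: 1 - val}
            elif op == "AND" and val == 1:
                new = {i: 1 for i in inputs}   # output 1 forces all inputs to 1
            else:
                continue
            for bit, v in new.items():
                if bit not in known:
                    known[bit] = v
                    changed = True
    return known
```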
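Toy continuous sketch of the switchable-constraint idea for an AND gate whose output is observed to be 0: the two candidate constraints `a = 0` and `b = 0` each get a switch variable, a prior keeps switches "on", and nonlinear least squares (standing in here for DCS/GNC-style robust estimation) down-weights whichever constraint conflicts with the rest of the graph. The weights and the `a = 1` "evidence" residual are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

LAM = 1.0   # prior strength keeping switches active
EV = 5.0    # weight of external evidence from the rest of the graph

def residuals(x):
    a, b, s1, s2 = x
    return np.array([
        s1 * a,             # switchable constraint 1: a = 0
        s2 * b,             # switchable constraint 2: b = 0
        LAM * (1.0 - s1),   # prior: keep constraint 1 active
        LAM * (1.0 - s2),   # prior: keep constraint 2 active
        EV * (a - 1.0),     # external evidence says a = 1, so constraint 1 must yield
    ])

x0 = np.array([0.5, 0.5, 1.0, 1.0])
sol = least_squares(residuals, x0, bounds=([0, 0, 0, 0], [1, 1, 1, 1]))
a, b, s1, s2 = sol.x
# Expected: b ~ 0 with s2 ~ 1 (its constraint stays active),
# while a ~ 1 and s1 is pushed down (its constraint is switched off).
print(a, b, s1, s2)
```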
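Continuous-domain gate sketch for the relaxation idea: bits live in [0, 1], AND becomes a product, INV becomes 1 - x, and a 32-bit BitVector becomes a length-32 array (LSB first). These are standard soft-logic identities, not the project's final formulation; the ADD relaxation via expected integer value is only exact for hard 0/1 inputs.

```python
import numpy as np

# Soft (continuous) versions of the bit-level gates; inputs/outputs in [0, 1].
def soft_inv(a):     return 1.0 - a
def soft_and(a, b):  return a * b
def soft_or(a, b):   return a + b - a * b
def soft_xor(a, b):  return a + b - 2.0 * a * b

def soft_shift(bits, k):
    """Logical left shift of a length-32 soft BitVector (index 0 = LSB)."""
    out = np.zeros_like(bits)
    out[k:] = bits[:len(bits) - k]
    return out

def soft_add(bits_a, bits_b):
    """Add two soft 32-bit vectors via their expected integer values (mod 2^32)."""
    weights = 2.0 ** np.arange(32)
    total = (bits_a @ weights + bits_b @ weights) % 2.0 ** 32
    # Back to a soft bit vector (exact only when the inputs are hard 0/1 bits).
    return np.array([(int(total) >> i) & 1 for i in range(32)], dtype=float)
```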
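Sketch of the neural-network idea: a small PyTorch module in which each "round" block sees both the previous activations and the original input via a skip connection (concatenation), so later nodes stay aware of decisions upstream. Layer sizes and depth are placeholders, and this sketch makes no attempt to preserve the exact AND/NOT structure.

```python
import torch
import torch.nn as nn

class SkipHashNet(nn.Module):
    """Stack of round blocks; every block also receives the raw input (skip)."""
    def __init__(self, n_bits=256, hidden=512, n_rounds=8):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Linear((hidden if i > 0 else n_bits) + n_bits, hidden)
            for i in range(n_rounds)
        )
        self.out = nn.Linear(hidden + n_bits, n_bits)

    def forward(self, x):
        h = x
        for block in self.blocks:
            # Concatenate the current state with the original input bits (skip).
            h = torch.relu(block(torch.cat([h, x], dim=-1)))
        return torch.sigmoid(self.out(torch.cat([h, x], dim=-1)))

net = SkipHashNet()
bits = torch.randint(0, 2, (4, 256)).float()   # a batch of random 256-bit inputs
print(net(bits).shape)                          # torch.Size([4, 256])
```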
