# tinyLIRPA


A pedagogical implementation of the ⍺β-CROWN algorithm for neural network verification, built on the tinygrad deep-learning library.

For example, after running the program below, `l` and `u` contain proven lower and upper bounds on the output of `f` over all inputs within a radius-0.1 L₂ ball around `x`.

```python
from tinygrad import Tensor
from tinylirpa import beta_crown

x = Tensor.randn(3)
w = Tensor.randn(3, 3)
f = lambda x: x.matmul(w).relu()

# Bound f over the L2 ball of radius 0.1 around x,
# using 10 optimization steps and 0 neuron splits.
l, u = beta_crown(f(x), x, 0.1, 2, 10, 0)
```

The `10` parameter specifies how many steps of gradient descent to run to optimize the bounds, and the `0` parameter specifies how many neuron splits to consider in the β-CROWN branch-and-bound. See report.pdf for an explanation of how this works.
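To make the notion of "proven bounds" concrete, here is a minimal, self-contained sketch (not part of tinyLIRPA, and much looser than ⍺β-CROWN) of interval bound propagation through the same one-layer ReLU network, using NumPy and an L∞ box instead of an L₂ ball for simplicity:

```python
import numpy as np

def ibp_bounds(x, W, eps):
    """Interval bound propagation through f(x) = relu(x @ W).

    Propagates the box [x - eps, x + eps] through the affine layer
    and the ReLU, returning sound (if loose) output bounds.
    """
    lo, hi = x - eps, x + eps
    # Split W into positive and negative parts so each output bound
    # pairs each weight with the correct input-interval endpoint.
    W_pos = np.clip(W, 0, None)
    W_neg = np.clip(W, None, 0)
    l = lo @ W_pos + hi @ W_neg  # lower bound of the affine output
    u = hi @ W_pos + lo @ W_neg  # upper bound of the affine output
    # ReLU is monotone, so it maps interval endpoints to endpoints.
    return np.maximum(l, 0), np.maximum(u, 0)

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
W = rng.standard_normal((3, 3))
l, u = ibp_bounds(x, W, 0.1)
y = np.maximum(x @ W, 0)  # the unperturbed output lies inside [l, u]
```

Each linear layer at most doubles the interval width in the worst case, which is why CROWN-style linear relaxation (and its ⍺/β-optimized variants) gives much tighter bounds on deeper networks than plain interval propagation.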
