@@ -30,7 +30,43 @@ The code structure looks like below:
- get_data_from_XML.py # parse Annotations of PASCAL VOC, helper of generator
```
-
+ ## Walk-through
+
+ The multibox loss consists of an `L1-smooth` loss and a `softmax` loss. Let's see what they look like.
+
+ `Arguments`
+     y_true: Ground truth bounding boxes,
+         tensor of shape (?, num_boxes, 4).
+     y_pred: Predicted bounding boxes,
+         tensor of shape (?, num_boxes, 4).
+ `Returns`
+     l1_loss: L1-smooth loss, tensor of shape (?, num_boxes).
+ `References` - https://arxiv.org/abs/1504.08083
+
+ ```python
+ def _l1_smooth_loss(self, y_true, y_pred):
+     # quadratic for errors below 1, linear (minus 0.5) beyond
+     abs_loss = tf.abs(y_true - y_pred)
+     sq_loss = 0.5 * (y_true - y_pred) ** 2
+     l1_loss = tf.where(tf.less(abs_loss, 1.0), sq_loss, abs_loss - 0.5)
+     return tf.reduce_sum(l1_loss, -1)
+ ```
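To sanity-check the piecewise definition, here is a minimal NumPy sketch of the same computation (the function name and sample values are made up for illustration):

```python
import numpy as np

def l1_smooth_loss(y_true, y_pred):
    # quadratic for errors below 1, linear (minus 0.5) above, summed over the last axis
    abs_loss = np.abs(y_true - y_pred)
    sq_loss = 0.5 * (y_true - y_pred) ** 2
    l1_loss = np.where(abs_loss < 1.0, sq_loss, abs_loss - 0.5)
    return l1_loss.sum(axis=-1)

y_true = np.array([[0.0, 0.0, 1.0, 1.0]])  # one box of 4 offsets
y_pred = np.array([[0.1, 0.0, 1.0, 3.0]])
print(l1_smooth_loss(y_true, y_pred))  # [1.505] = 0.5*0.1**2 + 0 + 0 + (2.0 - 0.5)
```

Note how the large error of 2.0 contributes linearly (1.5) rather than quadratically (2.0), which is what makes the loss robust to outlier boxes.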
+ Now let's walk through the `softmax` loss.
+
+ `Arguments`
+     y_true: Ground truth targets,
+         tensor of shape (?, num_boxes, num_classes).
+     y_pred: Predicted logits,
+         tensor of shape (?, num_boxes, num_classes).
+ `Returns`
+     softmax_loss: Softmax loss, tensor of shape (?, num_boxes).
+
+ ```python
+ def _softmax_loss(self, y_true, y_pred):
+     # clip predictions away from 0 and 1 so tf.log never receives 0
+     y_pred = tf.maximum(tf.minimum(y_pred, 1 - 1e-15), 1e-15)
+     softmax_loss = -tf.reduce_sum(y_true * tf.log(y_pred),
+                                   axis=-1)
+     return softmax_loss
+ ```
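The same cross-entropy can be sketched in NumPy to see the clipping and the reduction over the class axis at work (the function name and sample values are made up for illustration):

```python
import numpy as np

def softmax_loss(y_true, y_pred):
    # clip so log never receives 0, then cross-entropy over the class axis
    y_pred = np.clip(y_pred, 1e-15, 1 - 1e-15)
    return -np.sum(y_true * np.log(y_pred), axis=-1)

y_true = np.array([[0.0, 1.0, 0.0]])  # one-hot target: class 1
y_pred = np.array([[0.1, 0.8, 0.1]])  # predicted class probabilities
print(softmax_loss(y_true, y_pred))   # -log(0.8), roughly [0.2231]
```

Because the target is one-hot, only the probability assigned to the true class matters, so a confident correct prediction drives the loss toward zero.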
## Resources
The dataset can be downloaded from [the PASCAL VOC site](http://host.robots.ox.ac.uk/pascal/VOC/); the VOC2007 challenge is used in this example.