🦜 Object Detection 🦜

Building an object detector using an object detection model

Uses SSD (Single Shot MultiBox Detector)


1. Data Collection

  • Data for 11 parrot species (csv, img)

    • The 11 species: Goffin's cockatoo, Ducorps's cockatoo, white cockatoo, Moluccan cockatoo, greater sulphur-crested cockatoo, scarlet macaw, blue-and-gold macaw, Camelot macaw, green-winged macaw, rainbow lorikeet, African grey parrot
    • Each species folder contains csv and img subfolders
  • img

    • 300x300 images
  • csv

frame              xmin  xmax  ymin  ymax  class_id
0001_00000080.jpg  22    89    64    152   1
0001_00000047.jpg  40    124   46    160   1
0001_00000121.jpg  170   242   70    159   1
  • frame: image file name

  • xmin, xmax, ymin, ymax: pixel coordinates of the bounding box around the parrot's head (a quick way to visualize a box is sketched after this list)

  • class_id: parrot species label
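
To sanity-check the annotations, a csv can be loaded with pandas and its ground-truth box drawn on the matching image. A minimal sketch; the species folder and csv file names below are hypothetical:

    import pandas as pd
    import matplotlib.pyplot as plt
    import matplotlib.patches as patches
    from PIL import Image

    # Hypothetical paths -- replace with an actual species folder and csv name.
    csv_path = '모듈8데이터(SSD_앵무새)/스칼렛매커우/csv/train.csv'
    img_dir = '모듈8데이터(SSD_앵무새)/스칼렛매커우/img/'

    labels = pd.read_csv(csv_path)
    row = labels.iloc[0]

    # Draw the labelled head box on top of the 300x300 image.
    img = Image.open(img_dir + row['frame'])
    fig, ax = plt.subplots()
    ax.imshow(img)
    ax.add_patch(patches.Rectangle((row['xmin'], row['ymin']),
                                   row['xmax'] - row['xmin'],
                                   row['ymax'] - row['ymin'],
                                   fill=False, edgecolor='red', linewidth=2))
    ax.set_title(f"class_id = {row['class_id']}")
    plt.show()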


2. Training Data Preprocessing

  • Split the csv and img files into train, test, and val sets

    import glob
    import os
    import shutil
    import pandas as pd

    # Collect the annotation csv files and copy each referenced image into its split folder.
    train_csv, test_csv, val_csv = pd.DataFrame(), pd.DataFrame(), pd.DataFrame()

    for filename in glob.iglob('모듈8데이터(SSD_앵무새)/**/*.csv', recursive=True):
        csv = pd.read_csv(filename)
        if 'train' in filename:
            train_csv = pd.concat([train_csv, csv], ignore_index=True)
            for img in glob.iglob('모듈8데이터(SSD_앵무새)/**/*.jpg', recursive=True):
                if os.path.basename(img) in csv['frame'].values:
                    shutil.copy(img, "모듈8데이터(SSD_앵무새)/train/")
        if 'test' in filename:
            test_csv = pd.concat([test_csv, csv], ignore_index=True)
            for img in glob.iglob('모듈8데이터(SSD_앵무새)/**/*.jpg', recursive=True):
                if os.path.basename(img) in csv['frame'].values:
                    shutil.copy(img, "모듈8데이터(SSD_앵무새)/test/")
        if 'val' in filename:
            val_csv = pd.concat([val_csv, csv], ignore_index=True)
            for img in glob.iglob('모듈8데이터(SSD_앵무새)/**/*.jpg', recursive=True):
                if os.path.basename(img) in csv['frame'].values:
                    shutil.copy(img, "모듈8데이터(SSD_앵무새)/val/")
  • Create the h5 files (the same procedure applies to test and val; a val version is sketched after the code block below)

    # DataGenerator comes from the ssd_keras repository
    from data_generator.object_detection_2d_data_generator import DataGenerator

    # 1: Instantiate the DataGenerator
    train_dataset = DataGenerator(load_images_into_memory=False, hdf5_dataset_path=None)

    # 2: Parse the image and label lists
    train_dataset.parse_csv(images_dir='data/train/',
                            labels_filename='data/train/train.csv',
                            input_format=['image_name', 'xmin', 'xmax', 'ymin', 'ymax', 'class_id'],
                            include_classes='all')

    # 3: Convert the parsed dataset into an HDF5 file for faster loading
    train_dataset.create_hdf5_dataset(file_path='saved_model/dataset_train.h5',
                                      resize=False,
                                      variable_image_size=True,
                                      verbose=True)
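
The validation split goes through the same two steps; a minimal sketch, assuming the val images and csv sit under data/val/ and the output file is named analogously:

    # Same procedure for the validation split (and likewise for test).
    val_dataset = DataGenerator(load_images_into_memory=False, hdf5_dataset_path=None)

    val_dataset.parse_csv(images_dir='data/val/',
                          labels_filename='data/val/val.csv',
                          input_format=['image_name', 'xmin', 'xmax', 'ymin', 'ymax', 'class_id'],
                          include_classes='all')

    val_dataset.create_hdf5_dataset(file_path='saved_model/dataset_val.h5',
                                    resize=False,
                                    variable_image_size=True,
                                    verbose=True)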

3. Model Selection and Training

  • build model

    • model: SSD300

    • optimizer: Adam

    # Imports from Keras and the ssd_keras repository
    from keras import backend as K
    from keras.optimizers import Adam
    from models.keras_ssd300 import ssd_300
    from keras_loss_function.keras_ssd_loss import SSDLoss

    K.clear_session()  # Clear previous models from memory.

    # Build the SSD300 network in training mode.
    model = ssd_300(image_size=(img_height, img_width, img_channels),
                    n_classes=n_classes,
                    mode='training',
                    l2_regularization=0.0005,
                    scales=scales,
                    aspect_ratios_per_layer=aspect_ratios,
                    two_boxes_for_ar1=two_boxes_for_ar1,
                    steps=steps,
                    offsets=offsets,
                    clip_boxes=clip_boxes,
                    variances=variances,
                    normalize_coords=normalize_coords,
                    subtract_mean=mean_color,
                    swap_channels=swap_channels)

    # Initialize the base network with pre-trained VGG-16 weights.
    weights_path = './saved_model/VGG_ILSVRC_16_layers_fc_reduced.h5'
    model.load_weights(weights_path, by_name=True)

    # Compile with the SSD multibox loss (hard negative mining ratio 3:1).
    adam = Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)
    ssd_loss = SSDLoss(neg_pos_ratio=3, alpha=1.0)

    model.compile(optimizer=adam, loss=ssd_loss.compute_loss)
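
The build call above relies on hyperparameters (img_height, n_classes, scales, and so on) defined elsewhere in the notebook. A plausible configuration for this 11-class, 300x300 dataset, based on the SSD300 defaults shipped with the ssd_keras repository (the values actually used in the project may differ):

    # Assumed SSD300 configuration (ssd_keras defaults adapted to 11 classes).
    img_height, img_width, img_channels = 300, 300, 3
    mean_color = [123, 117, 104]     # per-channel mean subtracted from the input images
    swap_channels = [2, 1, 0]        # the original SSD weights expect BGR channel order
    n_classes = 11                   # 11 parrot species; the background class is added internally
    scales = [0.1, 0.2, 0.37, 0.54, 0.71, 0.88, 1.05]
    aspect_ratios = [[1.0, 2.0, 0.5],
                     [1.0, 2.0, 0.5, 3.0, 1.0/3.0],
                     [1.0, 2.0, 0.5, 3.0, 1.0/3.0],
                     [1.0, 2.0, 0.5, 3.0, 1.0/3.0],
                     [1.0, 2.0, 0.5],
                     [1.0, 2.0, 0.5]]
    two_boxes_for_ar1 = True
    steps = [8, 16, 32, 64, 100, 300]
    offsets = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
    clip_boxes = False
    variances = [0.1, 0.1, 0.2, 0.2]
    normalize_coords = True
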
  • training

    from math import ceil

    # train_generator / val_generator are batch generators created from the
    # DataGenerator objects above; the callbacks are sketched after this block.
    initial_epoch = 0
    final_epoch = 40
    steps_per_epoch = 100

    history = model.fit_generator(generator=train_generator,
                                  steps_per_epoch=steps_per_epoch,
                                  epochs=final_epoch,
                                  callbacks=callbacks,
                                  validation_data=val_generator,
                                  validation_steps=ceil(val_dataset_size/batch_size),
                                  initial_epoch=initial_epoch)
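
The callbacks list passed to fit_generator above is not shown; a minimal sketch using standard Keras callbacks (the file names and patience value are assumptions):

    from keras.callbacks import ModelCheckpoint, CSVLogger, EarlyStopping

    # Save the best weights, log per-epoch metrics, and stop early if val_loss stops improving.
    callbacks = [ModelCheckpoint(filepath='saved_model/ssd300_epoch-{epoch:02d}_val_loss-{val_loss:.4f}.h5',
                                 monitor='val_loss',
                                 save_best_only=True,
                                 verbose=1),
                 CSVLogger('saved_model/ssd300_training_log.csv', append=True),
                 EarlyStopping(monitor='val_loss', patience=10, verbose=1)]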

4. Model Improvement

  • optimizer: SGD -> Adam (a sketch of this change follows the list)

  • learning rate: 0.001 -> 0.0001

  • steps_per_epoch: 10 -> 100

  • batch_size and the number of epochs were also tuned
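
A minimal sketch of the optimizer change described above; the initial SGD settings shown here are assumptions, and the Adam line repeats the compile step from section 3:

    from keras.optimizers import SGD, Adam

    # Before (assumed initial setting): SGD with a learning rate of 0.001
    # sgd = SGD(lr=0.001, momentum=0.9, decay=0.0, nesterov=False)
    # model.compile(optimizer=sgd, loss=ssd_loss.compute_loss)

    # After: Adam with a learning rate of 0.0001
    adam = Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)
    model.compile(optimizer=adam, loss=ssd_loss.compute_loss)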


5. Model Evaluation

Loss graph for training up to epoch = 10

The loss value is converging.
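
The curves can be reproduced from the History object returned by fit_generator; a minimal sketch:

    import matplotlib.pyplot as plt

    # Plot training vs. validation loss recorded in the Keras History object.
    plt.plot(history.history['loss'], label='loss')
    plt.plot(history.history['val_loss'], label='val_loss')
    plt.xlabel('epoch')
    plt.ylabel('loss')
    plt.legend()
    plt.show()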


6. Test Results

  • epoch = 3, loss = 7.912, val_loss = 6.8487

๋ถ‰์€๊ด€์œ ํ™ฉ์•ต๋ฌด๋ฅผ ๋ฐฑ์ƒ‰์œ ํ™ฉ์•ต๋ฌด๋กœ ์˜ˆ์ธกํ•˜๋Š” ๊ฒƒ์„ ๋ณด์•„, ์ œ๋Œ€๋กœ ์˜ˆ์ธก์„ ํ•˜์ง€ ๋ชปํ•˜๊ณ  ์žˆ์Œ์„ ์•Œ ์ˆ˜ ์žˆ๋‹ค.

  • epoch = 11, loss = 4.8239, val_loss = 4.2383

๋ถ‰์€๊ด€์œ ํ™ฉ์•ต๋ฌด๋ฅผ ๋ถ‰์€๊ด€์œ ํ™ฉ์•ต๋ฌด๋กœ, ์นด๋ฉœ๋กฏ๋งค์ปค์šฐ๋ฅผ ์นด๋ฉœ๋กฏ๋งค์ปค์šฐ๋กœ ๋ถ„๋ฅ˜ํ•˜๊ณ  ์žˆ๋Š” ๊ฒƒ์„ ๋ณด์•„ ์˜ˆ์ธก ์„ฑ๋Šฅ์ด ํ–ฅ์ƒ๋œ ๊ฒƒ์„ ์•Œ ์ˆ˜ ์žˆ๋‹ค.
