
Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild [TIFS 2023]

Wenzheng Zeng1, Yang Xiao1†, Guilei Hu1, Zhiguo Cao1, Sicheng Wei1, Zhiwen Fang2, Joey Tianyi Zhou3, Junsong Yuan4

1Huazhong University of Science and Technology, 2Southern Medical University, 3A*STAR, 4University at Buffalo

This repository contains the official implementation of the paper "Eyelid’s Intrinsic Motion-aware Feature Learning for Real-time Eyeblink Detection in the Wild", which is accepted by IEEE Transactions on Information Forensics and Security (TIFS).

Introduction

Real-time eyeblink detection in the wild is a recently emerged and challenging task that suffers from dramatic variations in face attributes, pose, illumination, camera view and distance, etc. One key issue is to robustly characterize the eyelid's intrinsic motion (i.e., the approaching and departure between the upper and lower eyelid) under unconstrained conditions. Towards this, a novel eyelid's intrinsic motion-aware feature learning approach is proposed. Our proposition is threefold. First, the feature extractor is led to focus adaptively on the informative eye region by introducing visual attention in a coarse-to-fine way, to jointly guarantee robustness and fine-grained descriptive ability. Then, two constraints are proposed to make feature learning aware of the eyelid's intrinsic motion. One concerns the fact that the inter-frame feature divergence within eyeblink processes should be greater than that of non-eyeblink ones, to better reveal the eyelid's intrinsic motion. The other minimizes the inter-frame feature divergence of non-eyeblink samples, to suppress motion cues caused by head or camera movement, illumination change, etc. Meanwhile, to address the high ambiguity between eyeblink and non-eyeblink samples, soft sample labels are acquired via self-knowledge distillation to conduct feature learning with finer supervision than hard labels. Experiments verify that our proposition significantly outperforms state-of-the-art approaches while running in real time, and that it generalizes well to constrained conditions.
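The two motion-aware constraints above can be sketched as a pair of loss terms. This is a minimal NumPy illustration, not the paper's exact formulation: the function name, the use of L2 distance for the inter-frame divergence, the hinge-style ranking form, and the `margin` value are all assumptions for clarity.

```python
import numpy as np

def motion_aware_losses(feat_blink, feat_nonblink, margin=1.0):
    """Sketch of the two eyelid-motion constraints (illustrative only).

    feat_blink / feat_nonblink: batches of consecutive-frame feature
    pairs with shape (N, 2, D), where index 0/1 along axis 1 are the
    features of two adjacent frames.
    """
    # Inter-frame feature divergence: L2 distance between adjacent frames.
    d_blink = np.linalg.norm(feat_blink[:, 1] - feat_blink[:, 0], axis=1)
    d_non = np.linalg.norm(feat_nonblink[:, 1] - feat_nonblink[:, 0], axis=1)

    # Constraint 1: divergence within eyeblink processes should exceed
    # that of non-eyeblink ones (hinge-style ranking term).
    rank_loss = np.maximum(0.0, margin - (d_blink.mean() - d_non.mean()))

    # Constraint 2: directly minimize non-eyeblink divergence to suppress
    # motion cues from head/camera movement, illumination change, etc.
    suppress_loss = d_non.mean()
    return rank_loss, suppress_loss
```

Minimizing both terms pushes blink pairs apart in feature space while pulling non-blink pairs together, which is the intuition behind the two constraints described above.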

Installation

  1. Create a new conda environment:

    conda create -n blink_eyelid python=3.8
    conda activate blink_eyelid
  2. Install PyTorch (1.7.1 is recommended), along with opencv-python, tqdm, numpy, and scipy.
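The dependencies in step 2 can be installed roughly as follows. Only the PyTorch version (1.7.1) is stated by the authors; the torchvision version and the unpinned packages are assumptions, and you may need a CUDA-specific PyTorch build for your machine (see pytorch.org).

```shell
# Inside the activated blink_eyelid environment.
# PyTorch 1.7.1 (pick the build matching your CUDA version if needed).
pip install torch==1.7.1 torchvision==0.8.2

# Remaining dependencies (versions unpinned by the repository).
pip install opencv-python tqdm numpy scipy
```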

Data preparation

Here we provide a pre-processed version of HUST-LEBW, which contains the face images detected by InsightFace. Remember to change the dataset root path in 256.192.model/configs.py to your own.

Inference

  • Run test.py for inference and evaluation. Remember to change the dataset path to your own.

    python 256.192.model/test.py

    Note: In the HUST-LEBW benchmark, "left eye" refers to the eye on the left side of the image, rather than the subject's anatomical left eye. Similarly, "right eye" refers to the eye on the right side of the image; these labels are mirror-symmetric to the subject's actual eyes.
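The naming convention in the note can be made concrete with a small sketch. The helper below is hypothetical (not part of this repository) and simply splits a face crop down the middle to show which half each HUST-LEBW label refers to.

```python
import numpy as np

def crop_eye_halves(face_img):
    """Split a face crop into image-left and image-right halves.

    HUST-LEBW convention: the "left eye" is the eye on the left side
    of the *image*, i.e. the subject's right eye (mirrored), and vice
    versa. Illustrative helper only; the repository's actual cropping
    uses detected eye locations, not a naive midline split.
    """
    h, w = face_img.shape[:2]
    image_left = face_img[:, : w // 2]   # labelled "left eye" in HUST-LEBW
    image_right = face_img[:, w // 2 :]  # labelled "right eye"
    return image_left, image_right

face = np.zeros((192, 256, 3), dtype=np.uint8)  # dummy 256x192 face crop
left, right = crop_eye_halves(face)
```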

Citation

If you find our work useful in your research, please consider citing our paper:

@ARTICLE{10207771,
  author={Zeng, Wenzheng and Xiao, Yang and Hu, Guilei and Cao, Zhiguo and Wei, Sicheng and Fang, Zhiwen and Zhou, Joey Tianyi and Yuan, Junsong},
  journal={IEEE Transactions on Information Forensics and Security}, 
  title={Eyelid’s Intrinsic Motion-Aware Feature Learning for Real-Time Eyeblink Detection in the Wild}, 
  year={2023},
  volume={18},
  number={},
  pages={5109-5121},
  doi={10.1109/TIFS.2023.3301728}}
