Stars
An information-gathering and scanning tool for mobile targets (Android, iOS, WEB, H5, static websites), designed for HW operations / red team / penetration testing scenarios. It helps penetration testing engineers, attack-team members, and red-team members quickly collect key asset information from mobile apps or static web sites and provides basic output such as Title, Domain, CDN, fingerprint information, and status information.
Source code for "Learning Graph Convolutional Network for Skeleton-based Human Action Recognition by Neural Searching", AAAI 2020
Deep Learning Project.
The safety of senior citizens and children living alone at their residence has been a big concern for working individuals taking care of them. There is a need for a solution that can ensure 100% …
Contains code for comparing actions performed by human beings and scoring them
Multi-Stream Interaction Networks for Human Action Recognition
Code for: [Pattern Recognit. Lett. 2021] "Learn to cycle: Time-consistent feature discovery for action recognition" and [IJCNN 2021] "Multi-Temporal Convolutions for Human Action Recognition in Vi…
Multi-person Human Pose Estimation with HRNet in PyTorch
This repository implements the Breakfast Action Dataset in PyTorch and aims to achieve high human action recognition accuracy using the C3D model
PyTorch implementation of OpenPose, including hand and body pose estimation.
An OpenMMLab toolbox for human pose estimation, skeleton-based action recognition, and action synthesis.
Fast and accurate human pose estimation in PyTorch. Contains an implementation of the "Real-time 2D Multi-Person Pose Estimation on CPU: Lightweight OpenPose" paper.
Apply ML to the skeletons from OpenPose; 9 actions; multiple people. (WARNING: I'm sorry that this is only good for a course demo, not for real-world applications !!! Those are very difficult !!!)
Code to reproduce experiments in 'LSTM-based real-time action detection and prediction in human motion streams'
A simple but high-accuracy LSTM for human action recognition
Based on http://aqibsaeed.github.io/2016-11-04-human-activity-recognition-cnn/ and LSTM
Deep learning and LSTM approaches for human activity recognition
Real-Time Spatio-Temporally Localized Activity Detection by Tracking Body Keypoints
Real-time, Multi-person & Multi-camera Fall Detector in Python