Distinctive action sketch for human action recognition
| dc.contributor.author | Zheng, Ying | |
| dc.contributor.author | Yao, Hongxun | |
| dc.contributor.author | Sun, Xiaoshuai | |
| dc.contributor.author | Zhao, Sicheng | |
| dc.contributor.author | Porikli, Fatih | |
| dc.date.accessioned | 2024-05-08T05:22:08Z | |
| dc.date.issued | 2018 | |
| dc.date.updated | 2023-01-08T07:17:29Z | |
| dc.description.abstract | Recent developments in computer vision have led to renewed interest in sketch-related research, and considerable evidence has emerged revealing the significance of sketches. However, there has been little in-depth discussion of sketch-based action analysis so far. In this paper, we propose an approach to discover the most distinctive sketches for action recognition. The action sketches should satisfy two characteristics: sketchability and objectness. Primitive sketches are prepared using structured-forest-based fast edge detection, while Faster R-CNN is used in parallel to detect the persons. On completion of these two stages, distinctive action sketch mining is carried out. We then present four sketch pooling methods to obtain a uniform representation for action videos. Experimental results show that the proposed method achieves impressive performance against several compared methods on two public datasets. | en_AU |
| dc.description.sponsorship | The work was supported in part by the National Science Foundation of China (61472103, 61772158, 61702136, and 61701273) and an Australian Research Council (ARC) grant (DP150104645). We would especially like to thank the China Scholarship Council (CSC) for funding the first author to conduct part of this project at the Australian National University. | en_AU |
| dc.format.mimetype | application/pdf | en_AU |
| dc.identifier.issn | 0165-1684 | en_AU |
| dc.identifier.uri | http://hdl.handle.net/1885/317363 | |
| dc.language.iso | en_AU | en_AU |
| dc.publisher | Elsevier | en_AU |
| dc.relation | http://purl.org/au-research/grants/arc/DP150104645 | en_AU |
| dc.rights | © 2017 Elsevier B.V. | en_AU |
| dc.source | Signal Processing | en_AU |
| dc.subject | Action sketch | en_AU |
| dc.subject | Sketch pooling | en_AU |
| dc.subject | Action recognition | en_AU |
| dc.title | Distinctive action sketch for human action recognition | en_AU |
| dc.type | Journal article | en_AU |
| local.bibliographicCitation.lastpage | 332 | en_AU |
| local.bibliographicCitation.startpage | 323 | en_AU |
| local.contributor.affiliation | Zheng, Ying, College of Health and Medicine, ANU | en_AU |
| local.contributor.affiliation | Yao, Hongxun, Harbin Institute of Technology | en_AU |
| local.contributor.affiliation | Sun, Xiaoshuai, Harbin Institute of Technology | en_AU |
| local.contributor.affiliation | Zhao, Sicheng, Tsinghua University | en_AU |
| local.contributor.affiliation | Porikli, Fatih, College of Engineering, Computing and Cybernetics, ANU | en_AU |
| local.contributor.authoruid | Zheng, Ying, u6507469 | en_AU |
| local.contributor.authoruid | Porikli, Fatih, u5405232 | en_AU |
| local.description.embargo | 2099-12-31 | |
| local.description.notes | Imported from ARIES | en_AU |
| local.identifier.absfor | 400900 - Electronics, sensors and digital hardware | en_AU |
| local.identifier.ariespublication | u4351680xPUB345 | en_AU |
| local.identifier.citationvolume | 144 | en_AU |
| local.identifier.doi | 10.1016/j.sigpro.2017.10.022 | en_AU |
| local.identifier.scopusID | 2-s2.0-85032816063 | |
| local.identifier.thomsonID | WOS:000419412000034 | |
| local.publisher.url | https://www.elsevier.com/en-au | en_AU |
| local.type.status | Published Version | en_AU |
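The abstract outlines a pipeline of edge detection, parallel person detection, distinctive-sketch mining, and sketch pooling. A minimal sketch of the last two stages is given below, assuming the edge map and the person bounding box have already been produced by the detectors; `distinctive_sketch`, `grid_max_pool`, and the grid max-pooling itself are hypothetical stand-ins, not the paper's actual four pooling methods.

```python
import numpy as np

def distinctive_sketch(edge_map, person_box):
    """Keep only edge responses inside the detected person box
    (a hypothetical stand-in for the sketch-mining stage)."""
    x1, y1, x2, y2 = person_box
    mask = np.zeros_like(edge_map)
    mask[y1:y2, x1:x2] = 1.0
    return edge_map * mask

def grid_max_pool(sketch, grid=(2, 2)):
    """Max-pool the sketch into a fixed-size grid descriptor
    (illustrative only; the paper proposes four pooling variants)."""
    h, w = sketch.shape
    gh, gw = grid
    feat = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            cell = sketch[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            feat[i, j] = cell.max() if cell.size else 0.0
    return feat.ravel()

# Toy example: an 8x8 "edge map" with a person occupying the left half.
edges = np.random.default_rng(0).random((8, 8))
sketch = distinctive_sketch(edges, (0, 0, 4, 8))
descriptor = grid_max_pool(sketch)
print(descriptor.shape)  # (4,)
```

Masking before pooling means the descriptor reflects only person-centred edges, which is the intuition behind requiring both sketchability and objectness.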
Downloads
Original bundle
- Name: 1-s2.0-S016516841730381X-main.pdf
- Size: 1.81 MB
- Format: Adobe Portable Document Format