A Unified Neural Network for Panoptic Segmentation

dc.contributor.author: Yao, Li
dc.contributor.author: Chyau, Ang
dc.contributor.editor: Lee, Jehee and Theobalt, Christian and Wetzstein, Gordon
dc.date.accessioned: 2019-10-14T05:08:57Z
dc.date.available: 2019-10-14T05:08:57Z
dc.date.issued: 2019
dc.description.abstract: In this paper, we propose a unified neural network for panoptic segmentation, a task that combines semantic and instance segmentation to achieve more fine-grained scene parsing. Following existing methods that combine the two tasks, our method relies on a triple-branch neural network. In the first stage, we adopt a ResNet-50 with a feature pyramid network (FPN) as a shared backbone to extract features. Each branch then leverages the shared feature maps, serving as the stuff, things, or mask branch. Lastly, the branch outputs are fused following a well-designed strategy. Extensive experimental results on the MS-COCO dataset demonstrate that our approach achieves a Panoptic Quality (PQ) score competitive with the state of the art.
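The abstract evaluates with the Panoptic Quality (PQ) metric. As background (this is the standard PQ definition, not code from the paper; the function name and inputs are illustrative), PQ factors into segmentation quality (SQ), the mean IoU over matched segment pairs, and recognition quality (RQ), an F1-style detection score:

```python
def panoptic_quality(tp_ious, num_fp, num_fn):
    """Standard PQ = SQ * RQ.

    tp_ious: IoU of each matched (predicted, ground-truth) segment pair;
             pairs count as true positives when IoU > 0.5.
    num_fp:  unmatched predicted segments (false positives).
    num_fn:  unmatched ground-truth segments (false negatives).
    """
    tp = len(tp_ious)
    if tp + num_fp + num_fn == 0:
        return 0.0
    # SQ: average IoU over true-positive matches only.
    sq = sum(tp_ious) / tp if tp else 0.0
    # RQ: F1-like recognition term, halving FP and FN.
    rq = tp / (tp + 0.5 * num_fp + 0.5 * num_fn)
    return sq * rq
```

For example, two matches with IoUs 0.8 and 0.6, plus one false positive and one false negative, give SQ = 0.7 and RQ = 2/3.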
dc.description.number: 7
dc.description.sectionheaders: Images and Learning
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 38
dc.identifier.doi: 10.1111/cgf.13852
dc.identifier.issn: 1467-8659
dc.identifier.pages: 461-468
dc.identifier.uri: https://doi.org/10.1111/cgf.13852
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf13852
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: Computing methodologies
dc.subject: Image segmentation
dc.subject: Neural networks
dc.title: A Unified Neural Network for Panoptic Segmentation