Unsupervised pre-training for fully convolutional neural networks

dc.contributor.author: Wiehman, Stiaan [en_ZA]
dc.contributor.author: Kroon, Steve [en_ZA]
dc.contributor.author: De Villiers, Hendrik [en_ZA]
dc.date.accessioned: 2017-01-23T13:50:17Z
dc.date.available: 2017-01-23T13:50:17Z
dc.date.issued: 2016
dc.description: CITATION: Wiehman, S., Kroon, S. & De Villiers, H. 2016. Unsupervised pre-training for fully convolutional neural networks. Pattern Recognition Association of South Africa and Robotics and Mechatronics International Conference (PRASA-RobMech), 30 November-2 December 2016, Stellenbosch, South Africa.
dc.description: The original publication is available at http://ieeexplore.ieee.org
dc.description.abstract: Unsupervised pre-training of neural networks has been shown to act as a regularization technique, improving performance and reducing model variance. Recently, fully convolutional networks (FCNs) have shown state-of-the-art results on various semantic segmentation tasks. Unfortunately, there is no efficient approach available for FCNs to benefit from unsupervised pre-training. Given the unique property of FCNs to output segmentation maps, we explore a novel variation of unsupervised pre-training specifically designed for FCNs. We extend an existing FCN, called U-net, to facilitate end-to-end unsupervised pre-training and apply it to the ISBI 2012 EM segmentation challenge data set. We performed a battery of significance tests for both equality of means and equality of variance, and show that our results are consistent with previous work on unsupervised pre-training obtained from much smaller networks. We conclude that end-to-end unsupervised pre-training for FCNs adds robustness to random initialization, thus reducing model variance. [en_ZA] (An illustrative sketch of this pre-training scheme follows the metadata fields below.)
dc.description.uri: http://ieeexplore.ieee.org/document/7813160/
dc.description.version: Post print
dc.format.extent: 6 pages ; illustrations [en_ZA]
dc.identifier.citation: Wiehman, S., Kroon, S. & De Villiers, H. 2016. Unsupervised pre-training for fully convolutional neural networks. Pattern Recognition Association of South Africa and Robotics and Mechatronics International Conference (PRASA-RobMech), 30 November-2 December 2016, Stellenbosch, South Africa.
dc.identifier.isbn: 978-1-5090-3335-5 (online)
dc.identifier.isbn: 978-1-5090-3336-2 (print)
dc.identifier.other: doi:10.1109/RoboMech.2016.7813160
dc.identifier.uri: http://hdl.handle.net/10019.1/100503
dc.language.iso: en_ZA [en_ZA]
dc.publisher: Institute of Electrical and Electronics Engineers [en_ZA]
dc.rights.holder: Institute of Electrical and Electronics Engineers [en_ZA]
dc.subject: Neural networks (Computer science) [en_ZA]
dc.subject: Convolutions (Mathematics) [en_ZA]
dc.subject: Map segmentation -- Semantics [en_ZA]
dc.title: Unsupervised pre-training for fully convolutional neural networks [en_ZA]
dc.type: Conference Paper [en_ZA]
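
Illustrative sketch (not part of the original record): the abstract above describes end-to-end unsupervised pre-training of a U-net-style FCN, where the network is first trained without labels and the resulting weights then initialize supervised segmentation training. The code below is a minimal, hypothetical illustration of that general idea, not the authors' implementation; PyTorch, the TinyFCN architecture, the tensor shapes, and all hyperparameters are assumptions chosen only to keep the sketch self-contained and runnable.

# Hypothetical sketch only; assumes PyTorch, not the code used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFCN(nn.Module):
    """Toy encoder-decoder stand-in for a U-net-style fully convolutional network."""
    def __init__(self, out_channels=1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, out_channels, 3, padding=1),
        )

    def forward(self, x):
        # Output has the same spatial size as the input, so the network can be
        # trained either to reconstruct the image or to predict a segmentation map.
        return self.decoder(self.encoder(x))

net = TinyFCN(out_channels=1)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Stage 1: unsupervised pre-training -- train the FCN end to end to
# reconstruct unlabelled image patches (random placeholder data here).
images = torch.rand(4, 1, 64, 64)
for _ in range(10):
    opt.zero_grad()
    F.mse_loss(net(images), images).backward()
    opt.step()

# Stage 2: supervised fine-tuning -- keep the pre-trained weights and switch
# to a pixel-wise segmentation loss against (placeholder) binary label maps.
masks = (torch.rand(4, 1, 64, 64) > 0.5).float()
for _ in range(10):
    opt.zero_grad()
    F.binary_cross_entropy_with_logits(net(images), masks).backward()
    opt.step()

In the paper's setting, stage 2 would use the extended U-net on the ISBI 2012 EM segmentation challenge data rather than random tensors; the two-stage structure is the only point the sketch is meant to convey.
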
Files

Original bundle (1 file)
Name: wiehman_unsupervised_2016.pdf
Size: 1.85 MB
Format: Adobe Portable Document Format
Description: Download article

License bundle (1 file)
Name: license.txt
Size: 1.95 KB
Format: Item-specific license agreed upon to submission