百车全说论坛 (Baiche Quanshuo Forum)


Efficient Inference in Deep Neural Networks Based on Dynamic Representations and Decision Gates | Houshi Auto (厚势汽车)

Author: Houshi (厚势)

Author: Mohammad Saeed Shafiee*

±àÒ룺ͬ¼ÃÖÇÄÜÆû³µÑо¿Ëù

Editor: 啜小雪


Abstract: The difficult trade-off between the depth of a neural network and its computational cost currently makes it hard to deploy deep neural networks in many industrial applications, especially where computing power is limited. In this article, we are inspired by the idea that, although deeper embeddings are needed to discriminate difficult samples, a large fraction of samples can be discriminated well by much shallower embeddings. In this study, we introduce the concept of decision gates (d-gates): modules trained to decide whether a sample needs to be projected into a deeper embedding, or whether an early prediction can be made at the d-gate, thus enabling the computation of dynamic representations at different depths. The proposed d-gate modules can be integrated with any deep neural network and reduce the average computational cost of the network while preserving its modeling accuracy. Experimental results show that, on networks trained on the CIFAR10 dataset, the proposed d-gate modules yield a ~38% speed-up and ~39% FLOPs reduction for ResNet-101 and a ~46% speed-up and ~36% FLOPs reduction for DenseNet-201, with only a ~2% drop in accuracy.

I. Introduction

Previous research [16] has shown that deeper network architectures usually deliver better modeling performance; deeper architectures, however, also bring problems of their own. Besides being more prone to overfitting and harder to train, the trade-off between depth and computational cost makes it difficult for many industrial applications to adopt deeper architectures.

He et al. [7] addressed the degradation problems in training deep neural networks (e.g., vanishing gradients) by introducing the concept of residual learning, in which learning is based on a residual mapping rather than directly on an unreferenced mapping. Shortly afterwards, Xie et al. [19] leveraged the Inception idea (i.e., the split-transform-merge strategy) within the residual-block structure to provide better subspace modeling while also addressing the degradation problem, yielding the ResNeXt architecture with improved modeling accuracy. To tackle the computational cost, a wide variety of methods have been proposed, including reduced precision [10], model compression [6], teacher-student strategies [8], and evolutionary algorithms [13, 14].

More recently, conditional computation [1, 4, 12, 18, 2] and early prediction [17] methods have been proposed to handle this problem; these methods involve the dynamic execution of different modules within a network. Conditional computation methods are largely motivated by the idea that a residual network can be viewed as an ensemble of shallower networks. These methods therefore use the skip connections to decide which residual modules need to be executed, with most of them relying on reinforcement learning.

ÔÚ±¾Ñо¿ÖУ¬ÎÒÃÇÖ÷Ҫ̽¾¿ÔçÆÚÔ¤²âÕâ¸öÏë·¨£¬µ«È¡¶ø´úÖ®µÄÊÇ´ÓÈí±ß¼ÊÖ§³ÖÏòÁ¿»ú[3]ÀíÂÛÖеõ½¾ö²ßÆôʾ¡£ÌØ±ðµØ£¬ÎÒÃÇÒýÈë¾ö²ßÃŵĸÅÄѵÁ·Ä£¿éÒÔ¾ö¶¨ÊÇ·ñÐèÒª½«Ñù±¾Í¶Ó°µ½¸üÉîµÄǶÈëÖУ¬»òÊÇ·ñ¿ÉÒÔÔÚ¾ö²ßÃÅ´¦½øÐÐÔçÆÚÔ¤²â£¬´Ó¶øÄܹ»ÔÚ²»Í¬Éî¶ÈÉϽøÐж¯Ì¬±íʾµÄÌõ¼þ¼ÆËã¡£ËùÌá³öµÄ¾ö²ßÃÅÄ£¿é¿ÉÒÔÓëÈκÎÉî²ãÉñ¾­ÍøÂ缯³É£¬¶ø²»ÐèÒª´ÓͷѵÁ·ÍøÂ磬´Ó¶øÔÚ±£³ÖÄ£Ð;«¶ÈµÄͬʱ½µµÍÁËÉî²ãÉñ¾­ÍøÂçµÄƽ¾ù¼ÆË㸴ÔÓ¶È¡£

Figure 1: Decision gates are integrated directly into the deep neural network and trained to predict whether the decision can be made at the d-gate or the sample needs to be projected into a deeper embedding.

II. Methodology

Compared with shallow architectures, deep neural network architectures provide better embeddings of the data subspace, and can therefore discriminate the data space better and achieve higher modeling accuracy. Inspired by the theory of soft-margin support vector machines [3], we propose the following hypothesis: although deeper embedding subspaces are necessary for samples that lie close to the decision boundary at the lower layers of the network, they are in fact unnecessary for samples that are already far from the decision boundary in the shallower embedding spaces. An effective mechanism for determining the distance between a sample and the decision boundary in the lower layers of the network would therefore make it possible to perform early prediction for such samples without projecting them into deeper embedding spaces. This approach would greatly reduce the average computational cost of prediction. However, designing an effective method for determining whether a sample is a boundary sample is a challenging problem.

Here, we formulate the early prediction problem as a risk minimization problem and introduce a set of single-layer feed-forward networks (which we call decision gates) that are integrated directly into the deep neural network (see Figure 1). The goal of a d-gate module is not only to determine whether a sample needs to be projected into the deep embedding space, but also to minimize the risk of early misclassification. Specifically, the d-gate modules integrated into the deep neural network are trained with a hinge loss [5], which minimizes the risk of early misclassification in the lower embeddings while deciding whether a sample is a boundary sample:
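The hinge-loss equation itself did not survive the conversion of this article. A standard one-vs-all multiclass hinge loss that is consistent with the definitions that follow (one weight vector and bias per class, with the scores acting as signed boundary distances) would read:

```latex
\ell(x, y) \;=\; \sum_{i=1}^{c} \max\!\left(0,\; 1 - y_i \left(w_i^{T} x - b_i\right)\right)
```

where $y_i = +1$ if the true class of $x$ is $i$ and $y_i = -1$ otherwise; this is a hedged reconstruction in the spirit of reference [5], not necessarily the authors' exact formula.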

where y is the ground-truth label of the input x, and ŷ is the class label predicted by the d-gate module with the set of weights w and biases b. The set of weights w has dimensionality f × c, where f denotes the number of input features to the d-gate module and c denotes the number of class labels in the classification task. The d-gate module offers an important advantage: the quantity wᵀx − b gives the distance of a sample to the corresponding decision boundary of each class label in the embedding space. Training the d-gate module in this way yields a linear classifier in which the samples that do not need a deeper embedding to be discriminated are those with a large (i.e., positive) distance to the decision boundary. It is worth noting that the single-layer nature of the d-gate module is intended for efficiency.

The d-gate modules are trained with the same training data used to train the deep neural network, and the goal of each d-gate module is to minimize the classification error on the training data. The loss function over the training data can therefore be expressed as:
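The equation for the loss over the training data was likewise lost in conversion. A form consistent with the surrounding text (the per-sample hinge loss ℓ summed over the training inputs X with their label set Y) would be:

```latex
\mathcal{L}(w, b) \;=\; \sum_{(x,\, y)\,\in\,(X,\, Y)} \ell(x, y)
```

again a hedged reconstruction rather than the authors' exact notation.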

where Y denotes the set of ground-truth labels of all the training data. What is most interesting about ℓ is that it is a convex function of w and b, and can therefore be optimized by gradient descent. Conventional gradient descent can thus be employed here, where a step is taken along a vector chosen from the function's subgradient [15] to find the optimum. As a result, the d-gates can be trained within a mini-batch training framework, which makes them very convenient to use when training deep neural networks on large datasets.
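The mini-batch training described above can be sketched in PyTorch, the framework the article uses later. This is an illustrative sketch, not the authors' code: `MultiMarginLoss` is PyTorch's multiclass hinge loss, and the feature and label tensors are random stand-ins for the backbone's intermediate embeddings and ground-truth labels.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
f, c = 64, 10                            # f input features, c classes (hypothetical sizes)
gate = nn.Linear(f, c)                   # the single-layer d-gate computing w^T x - b
hinge = nn.MultiMarginLoss(margin=1.0)   # multiclass hinge loss
optimizer = torch.optim.SGD(gate.parameters(), lr=0.1)

# Stand-ins: intermediate embeddings from the frozen backbone, plus their labels.
features = torch.randn(32, f)
labels = torch.randint(0, c, (32,))

initial_loss = hinge(gate(features), labels).item()
for _ in range(100):                     # mini-batch (sub)gradient-descent steps
    optimizer.zero_grad()
    loss = hinge(gate(features), labels)
    loss.backward()                      # autograd yields a valid subgradient here
    optimizer.step()
```

Because the gate is a single linear layer and the loss is convex, this optimization is cheap compared with training the backbone itself.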

In essence, the proposed d-gate module computes the distance of each sample to the decision boundary based on wᵀx − b; the computed distance is then compared with the decision threshold t of each d-gate to determine whether an early prediction should be made for the sample at the d-gate, or whether the sample should be passed on to the deeper stages of the deep neural network for a better prediction. Samples far from the decision boundary produce large values of wᵀx − b; therefore, if a sample's decision distance satisfies the decision threshold, the class corresponding to the maximum distance is assigned as the sample's predicted class label in this early prediction step.
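The threshold rule just described can be sketched as follows (an illustrative helper, with function and variable names assumed, not taken from the paper):

```python
import torch

def d_gate_decide(scores: torch.Tensor, threshold: float):
    """Early-exit rule: `scores` holds w^T x - b for each class of a batch.
    Samples whose largest boundary distance meets the threshold exit early
    with the arg-max class; the rest continue to the deeper stages."""
    max_dist, labels = scores.max(dim=1)
    exit_early = max_dist >= threshold
    return exit_early, labels

scores = torch.tensor([[3.2, -1.0, 0.5],   # far from the boundary -> early exit
                       [0.4,  0.3, 0.1]])  # near the boundary -> go deeper
exit_early, labels = d_gate_decide(scores, threshold=2.0)
# exit_early -> tensor([True, False]); labels -> tensor([0, 0])
```

Only the samples flagged `False` incur the cost of the remaining layers, which is where the average-FLOPs savings reported below come from.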

Figure 2: Accuracy versus the number of FLOPs: the performance of networks whose d-gates are trained with the proposed hinge loss, compared with d-gates trained with the conventional cross-entropy loss. It can be seen that d-gates trained with the hinge loss achieve both higher computational efficiency and higher accuracy than those trained with the cross-entropy loss.

Table 1: Experimental results for ResNet-101 and DenseNet-201 with different d-gate configurations. The average number of FLOPs and the accuracy of each configuration are compared with those of the original networks. d-gate(t1, t2) denotes a network with two d-gate modules, configured with decision thresholds t1 and t2 respectively.

III. Results and Discussion

The efficacy of the proposed d-gate modules is examined on the CIFAR10 dataset with two different network architectures (ResNet-101 [7] and DenseNet-201 [9]). A key advantage of the d-gate modules is that the trade-off between modeling accuracy and computational cost can be controlled precisely by adjusting the d-gate decision thresholds: lowering the thresholds increases the number of samples for which an early prediction is made, thereby greatly reducing the network's average computational cost of prediction. In this study, we integrate two d-gate modules into ResNet-101 (after the first and second main blocks) and into DenseNet-201 (after the first and second dense blocks), and explore different d-gate configurations. The networks are implemented in the PyTorch framework, and prediction speeds are reported on a single Nvidia TITAN Xp GPU.
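A minimal sketch of how two d-gates with thresholds (t1, t2) can be attached to a staged backbone at inference time is shown below. This uses a tiny stand-in backbone rather than the actual ResNet-101 or DenseNet-201, processes one sample at a time for simplicity, and all layer sizes are made up.

```python
import torch
import torch.nn as nn

# Tiny stand-in stages for a trained backbone's first and second main blocks.
stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8))
stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4))
head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 4 * 4, 10))

# One linear d-gate per stage, each with its own decision threshold (t1, t2).
gate1 = nn.Linear(16 * 8 * 8, 10)
gate2 = nn.Linear(32 * 4 * 4, 10)

@torch.no_grad()
def predict(x, t1=1.0, t2=2.0):
    h1 = stage1(x)                      # features after the first block
    s1 = gate1(h1.flatten(1))
    if s1.max() >= t1:                  # far from the boundary: exit at d-gate 1
        return s1.argmax(dim=1)
    h2 = stage2(h1)                     # otherwise compute the deeper embedding
    s2 = gate2(h2.flatten(1))
    if s2.max() >= t2:                  # exit at d-gate 2
        return s2.argmax(dim=1)
    return head(h2).argmax(dim=1)       # fall through to the full network
```

In a batched setting the exit decision would route subsets of the batch rather than the whole batch; the per-sample routing is what produces the reported savings in average FLOPs.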

As can be observed from Table 1, by integrating two d-gate modules with decision thresholds (t1, t2) = (2.5, 2.5), the computational cost of the ResNet network can be reduced by 67 MFLOPs while maintaining the same level of accuracy as the original ResNet-101. Integrating d-gate modules with distance thresholds (t1, t2) = (1.0, 2.0) at d-gate 1 and d-gate 2 can reduce the computational cost of the ResNet-101 network by 39% (i.e., a reduction of 1.95 GFLOPs) compared with the original ResNet-101, with a 1.7% drop in accuracy, resulting in a 38% speed-up. Experiments on DenseNet-201 show that it is possible to reduce the number of FLOPs by 970 MFLOPs (36%) with only a 2% drop in accuracy, yielding a 46% speed-up. Furthermore, within a 3% accuracy margin, using d-gate modules can achieve a 2.3× speed-up over the original DenseNet-201. Based on the experimental results, the proposed d-gate modules lead to a significant increase in prediction speed, making them very well suited to industrial applications.

Besides the proposed d-gate modules themselves, one of the main contributions of this work is the hinge loss introduced for training the d-gate modules. Past studies [11] have argued that the cross-entropy loss produces only small margins between the decision boundaries and the training data. As a consequence, there is little valuable information in the softmax output, and it is difficult to trust the confidence values of the softmax layer when making decisions about samples. To validate the efficacy of the hinge loss used in the proposed d-gates compared with the cross-entropy loss, an additional comparative experiment was conducted. More specifically, two d-gates were added to ResNet-101 in the same way as reported above; however, rather than being trained with the proposed hinge loss, the d-gates were trained with a cross-entropy loss. This makes it possible to compare the effect of the hinge loss and the cross-entropy loss on the functionality of the d-gates.

Figure 2 shows the accuracy versus the number of FLOPs for the network whose d-gates are trained with the proposed hinge loss, compared with the same training procedure using a regular cross-entropy loss. It can be observed that, for the same number of FLOPs in the network, the network whose d-gates are trained with the proposed hinge loss provides higher modeling accuracy than the one trained with the cross-entropy loss. The accuracy gap grows exponentially as the d-gates are configured so that the network uses fewer FLOPs. This illustrates the aforementioned problem with the cross-entropy loss and its decision boundaries.

References

[1] Emmanuel Bengio, Pierre-Luc Bacon, Joelle Pineau, and Doina Precup. Conditional computation in neural networks for faster models. arXiv preprint arXiv:1511.06297, 2015.

[2] Tolga Bolukbasi, Joseph Wang, Ofer Dekel, and Venkatesh Saligrama. Adaptive neural networks for efficient inference. arXiv preprint arXiv:1702.07811, 2017.

[3] Corinna Cortes and Vladimir Vapnik. Support-vector networks. Machine Learning, 20(3):273–297, 1995.

[4] Ludovic Denoyer and Patrick Gallinari. Deep sequential neural network. arXiv preprint arXiv:1410.0510, 2014.

[5] Ürün Dogan, Tobias Glasmachers, and Christian Igel. A unified view on multi-class support vector classification. Journal of Machine Learning Research, 17(45):1–32, 2016.

[6] Song Han, Huizi Mao, and William J Dally. Deep compression: Compressing deep neural networks with pruning, trained quantization and huffman coding. arXiv preprint arXiv:1510.00149, 2015.

[7] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778, 2016.

[8] Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531, 2015.

[9] Gao Huang, Zhuang Liu, Laurens van der Maaten, and Kilian Q Weinberger. Densely connected convolutional networks. In CVPR, volume 1, page 3, 2017.

[10] Benoit Jacob, Skirmantas Kligys, Bo Chen, Menglong Zhu, Matthew Tang, Andrew Howard, Hartwig Adam, and Dmitry Kalenichenko. Quantization and training of neural networks for efficient integer-arithmetic-only inference. arXiv preprint arXiv:1712.05877, 2017.

[11] Xuezhi Liang, Xiaobo Wang, Zhen Lei, Shengcai Liao, and Stan Z Li. Soft-margin softmax for deep classification. In International Conference on Neural Information Processing, pages 413–421. Springer, 2017.

[12] Lanlan Liu and Jia Deng. Dynamic deep neural networks: Optimizing accuracy-efficiency trade-offs by selective execution. arXiv preprint arXiv:1701.00299, 2017.

[13] M. Shafiee, A. Mishra, and A. Wong. Deep learning with Darwin: Evolutionary synthesis of deep neural networks. arXiv:1606.04393, 2016.

[14] M. Shafiee and A. Wong. Evolutionary synthesis of deep neural networks via synaptic cluster-driven genetic encoding. In NIPS Workshop, 2016.

[15] Shai Shalev-Shwartz, Yoram Singer, Nathan Srebro, and Andrew Cotter. Pegasos: Primal estimated sub-gradient solver for SVM. Mathematical Programming, 127(1):3–30, 2011.

[16] Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.

[17] Surat Teerapittayanon, Bradley McDanel, and HT Kung. Branchynet: Fast inference via early exiting from deep neural networks. In Pattern Recognition (ICPR), 2016 23rd International Conference on, pages 2464–2469. IEEE, 2016.

[18] Zuxuan Wu, Tushar Nagarajan, Abhishek Kumar, Steven Rennie, Larry S Davis, Kristen Grauman, and Rogerio Feris. Blockdrop: Dynamic inference paths in residual networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 8817–8826, 2018.

[19] Saining Xie, Ross Girshick, Piotr Dollár, Zhuowen Tu, and Kaiming He. Aggregated residual transformations for deep neural networks. In Computer Vision and Pattern Recognition (CVPR), 2017 IEEE Conference on, pages 5987–5995. IEEE, 2017.
