6.034f Neural Net Notes
October 28, 2010

These notes are a supplement to material presented in lecture.
I lay out the mathematics more prettily and extend the analysis to handle multiple neurons per layer. Also, I develop the backpropagation rule, which is often needed on quizzes.

I use a notation that I think improves on previous explanations. The reason is that the notation here plainly associates each input, output, and weight with a readily identified neuron, a left-side one and a right-side one. When you arrive at the update formulas, you will have less trouble relating the variables in the formulas to the variables in a diagram.
On the other hand, seeing yet another notation may confuse you, so if you already feel comfortable with a set of update formulas, you will not gain by reading these notes.
The sigmoid function

The sigmoid function, y = 1/(1 + e^{-x}), is used instead of a step function in artificial neural nets because the sigmoid is continuous, whereas a step function is not, and you need continuity whenever you want to use gradient ascent.
Also, the sigmoid function has several desirable qualities. For example, the sigmoid function's value, y, approaches 1 as x becomes highly positive; approaches 0 as x becomes highly negative; and equals 1/2 when x = 0.
Better yet, the sigmoid function features a remarkably simple derivative of the output, y, with respect to the input, x:

\frac{dy}{dx} = \frac{d}{dx}\left[\frac{1}{1 + e^{-x}}\right]
             = \frac{d}{dx}(1 + e^{-x})^{-1}
             = -1 \times (1 + e^{-x})^{-2} \times e^{-x} \times (-1)
             = \frac{1}{1 + e^{-x}} \times \frac{e^{-x}}{1 + e^{-x}}
             = \frac{1}{1 + e^{-x}} \times \frac{(1 + e^{-x}) - 1}{1 + e^{-x}}
             = \frac{1}{1 + e^{-x}} \times \left(1 - \frac{1}{1 + e^{-x}}\right)
             = y(1 - y)

Thus, remarkably, the derivative of the output with respect to the input is expressed as a simple function of the output.
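As a concrete check, here is a small Python sketch of my own (not part of the original notes) that compares y(1 - y) against a finite-difference estimate of dy/dx; the helper names sigmoid and sigmoid_derivative are assumptions of this example.

import math

def sigmoid(x):
    """The sigmoid function y = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """dy/dx written in terms of the output: y * (1 - y)."""
    y = sigmoid(x)
    return y * (1.0 - y)

# Numerical check at an arbitrary point.
x, h = 0.7, 1e-6
finite_difference = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(sigmoid_derivative(x), finite_difference)  # the two values agree closely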
The performance function

The standard performance function for gauging how well a neural net is doing is given by the following:

P = -\frac{1}{2}(d_{sample} - o_{sample})^2

where P is the performance function, d_{sample} is the desired output for some specific sample, and o_{sample} is the observed output for that sample. From this point forward, assume that d and o are the desired and observed outputs for a specific sample so that we need not drag a subscript around as we work through the algebra.

The reason for choosing the given formula for P is that the formula has convenient properties. The formula yields a maximum at o = d and monotonically decreases as o deviates from d.
Moreover, the derivative of P with respect to o is simple:

\frac{dP}{do} = \frac{d}{do}\left[-\frac{1}{2}(d - o)^2\right]
             = -2 \times \frac{1}{2} \times (d - o) \times (-1)
             = d - o
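A quick numerical illustration (mine, not in the original notes): with d = 1.0 and o = 0.8, P = -1/2 (0.2)^2 = -0.02 and dP/do = 0.2, which the finite-difference check below confirms.

def performance(d, o):
    """P = -1/2 (d - o)^2; maximal (zero) when o equals d."""
    return -0.5 * (d - o) ** 2

d, o, h = 1.0, 0.8, 1e-6
analytic = d - o                                                  # dP/do = d - o
finite_difference = (performance(d, o + h) - performance(d, o - h)) / (2 * h)
print(performance(d, o), analytic, finite_difference)             # -0.02, 0.2, ~0.2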
Gradient ascent

Backpropagation is a specialization of the idea of gradient ascent. You are trying to find the maximum of a performance function P, by changing the weights associated with neurons, so you move in the direction of the gradient in a space that gives P as a function of the weights, w.
That is, you move in the direction of most rapid ascent if we take a step in the direction with components governed by the following formula, which shows how much to change a weight, w, in terms of a partial derivative:

\Delta w \propto \frac{\partial P}{\partial w}

The actual change is influenced by a rate constant, \alpha; accordingly, the new weight, w', is given by the following:

w' = w + \alpha \times \frac{\partial P}{\partial w}

Gradient descent

If the performance function were \frac{1}{2}(d_{sample} - o_{sample})^2 instead of -\frac{1}{2}(d_{sample} - o_{sample})^2, then you would be searching for the minimum rather than the maximum of P, and the change in w would be subtracted from w instead of added, so w' would be w - \alpha \times \frac{\partial P}{\partial w} instead of w + \alpha \times \frac{\partial P}{\partial w}.
The two sign changes, one in the performance function and the other in the update formula, cancel, so in the end, you get the same result whether you use gradient ascent, as I prefer, or gradient descent.
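The sign bookkeeping is easy to verify with a tiny sketch (an illustration of mine, not from the notes); g stands for an already-computed value of \partial P / \partial w under the ascent convention, and the numbers are arbitrary.

# Ascent on P = -1/2 (d - o)^2 versus descent on E = 1/2 (d - o)^2.
# Because E = -P, the partial derivative flips sign: dE/dw = -dP/dw.
g = 0.12        # assumed value of dP/dw at the current weight
alpha = 0.5     # rate constant
w = 1.0

w_ascent = w + alpha * g         # gradient ascent on P
w_descent = w - alpha * (-g)     # gradient descent on E, using dE/dw = -g

print(w_ascent == w_descent)     # True: the two sign changes cancel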
The simplest neural net

Consider the simplest possible neural net: one input, one output, and two neurons, the left neuron and the right neuron. A net with two neurons is the smallest that illustrates how the derivatives can be computed layer by layer.
[Figure: the simplest neural net. The left neuron multiplies its input i_l by its weight w_l to form the product p_l and passes p_l through a sigmoid to produce its output o_l; the right neuron does the same with i_r, w_r, p_r, and o_r.]

Note that the subscripts indicate layer.
Thus, i_l, w_l, p_l, and o_l are the input, weight, product, and output associated with the neuron on the left, while i_r, w_r, p_r, and o_r are the input, weight, product, and output associated with the neuron on the right. Of course, o_l = i_r.

Suppose that the output of the right neuron, o_r, is the value that determines performance P. To compute the partial derivative of P with respect to the weight in the right neuron, w_r, you need the chain rule, which allows you to compute partial derivatives of one variable with respect to another in terms of an intermediate variable.
In particular, for w_r, you have the following, taking o_r to be the intermediate variable:

\frac{\partial P}{\partial w_r} = \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial w_r}

Now, you can repeat, using the chain rule to turn \frac{\partial o_r}{\partial w_r} into \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_r}:

\frac{\partial P}{\partial w_r} = \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_r}

Conveniently, you have seen two of the derivatives already, and the third, \frac{\partial p_r}{\partial w_r} = \frac{\partial (w_r \times o_l)}{\partial w_r} = o_l = i_r, is easy to compute:

\frac{\partial P}{\partial w_r} = [(d - o_r)] \times [o_r(1 - o_r)] \times [i_r]

Repeating the analysis for w_l yields the following.
Each line is the same as the previous one, except that one more partial derivative is expanded using the chain rule:

\frac{\partial P}{\partial w_l} = \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial w_l}
= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_l}
= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial o_l} \times \frac{\partial o_l}{\partial w_l}
= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial o_l} \times \frac{\partial o_l}{\partial p_l} \times \frac{\partial p_l}{\partial w_l}
= [(d - o_r)] \times [o_r(1 - o_r)] \times [w_r] \times [o_l(1 - o_l)] \times [i_l]

Thus, the derivative consists of products of terms that have already been computed and terms in the vicinity of w_l.
This is clearer if you write the two derivatives next to one another:

\frac{\partial P}{\partial w_r} = (d - o_r) \times o_r(1 - o_r) \times i_r
\frac{\partial P}{\partial w_l} = (d - o_r) \times o_r(1 - o_r) \times w_r \times o_l(1 - o_l) \times i_l

You can simplify the equations by defining δs as follows, where each delta is associated with either the left or right neuron:

\delta_r = o_r(1 - o_r) \times (d - o_r)
\delta_l = o_l(1 - o_l) \times w_r \times \delta_r

Then, you can write the partial derivatives with the δs:

\frac{\partial P}{\partial w_r} = i_r \times \delta_r
\frac{\partial P}{\partial w_l} = i_l \times \delta_l

If you add more layers to the front of the network, each weight has a partial derivative that is computed like the partial derivative of the weight of the left neuron.
That is, each has a partial derivative determined by its input and its delta, where its delta in turn is determined by its output, the weight to its right, and the delta to its right.
Thus, for the weights in the final layer, you compute the change as follows, where I use f as the subscript instead of r to emphasize that the computation is for the neuron in the final layer:

\Delta w_f = \alpha \times i_f \times \delta_f    where    \delta_f = o_f(1 - o_f) \times (d - o_f)

For all other layers, you compute the change as follows:

\Delta w_l = \alpha \times i_l \times \delta_l    where    \delta_l = o_l(1 - o_l) \times w_r \times \delta_r
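Here is a minimal Python sketch of one gradient-ascent step for the two-neuron net (my own illustration, not part of the notes); the variable names mirror the subscripts above, and the function name train_step and the sample values are assumptions of this example.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(i_l, d, w_l, w_r, alpha=1.0):
    """One gradient-ascent step on P = -1/2 (d - o_r)^2 for the two-neuron net."""
    # Forward pass, left to right.
    p_l = w_l * i_l
    o_l = sigmoid(p_l)
    i_r = o_l                    # the left neuron's output feeds the right neuron
    p_r = w_r * i_r
    o_r = sigmoid(p_r)

    # Deltas, right to left.
    delta_r = o_r * (1 - o_r) * (d - o_r)
    delta_l = o_l * (1 - o_l) * w_r * delta_r

    # Each weight changes by alpha times the neuron's input times its delta.
    w_l_new = w_l + alpha * i_l * delta_l
    w_r_new = w_r + alpha * i_r * delta_r
    return w_l_new, w_r_new, o_r

# Example: drive the net's output toward a desired value of 0.9.
w_l, w_r = 0.5, -0.3
for _ in range(2000):
    w_l, w_r, o_r = train_step(i_l=1.0, d=0.9, w_l=w_l, w_r=w_r, alpha=1.0)
print(o_r)   # close to 0.9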
More neurons per layer

Of course, you really want backpropagation formulas for not only any number of layers but also for any number of neurons per layer, each of which can have multiple inputs, each with its own weight. Accordingly, you need to generalize in another direction, allowing multiple neurons in each layer and multiple weights attached to each neuron.
The generalization is an adventure in summations, with lots of subscripts to keep straight, but in the end, the result matches intuition.
For the final layer, there may be many neurons, so the formulas need an index, k, indicating which final-layer neuron is in play.
For any weight contained in the final-layer neuron, f_k, you compute the change as follows from the input corresponding to the weight and from the δ associated with the neuron:

\Delta w = \alpha \times i \times \delta_{f_k}
\delta_{f_k} = o_{f_k}(1 - o_{f_k}) \times (d_k - o_{f_k})

Note that the output of each final-layer neuron is subtracted from the output desired for that neuron.
For other layers, there may also be many neurons, and the output of each may influence all the neurons in the next layer to the right.
The change in weight has to account for what happens to all of those neurons to the right, so a summation appears, but otherwise you compute the change, as before, from the input corresponding to the weight and from the δ associated with the neuron:

\Delta w = \alpha \times i \times \delta_{l_i}
\delta_{l_i} = o_{l_i}(1 - o_{l_i}) \times \sum_j w_{l_i \rightarrow r_j} \times \delta_{r_j}

Note that w_{l_i \rightarrow r_j} is the weight that connects the jth right-side neuron to the output of the ith left-side neuron.
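The generalized formulas translate directly into code. The following sketch (mine, not from the notes) computes the δs for a final layer and for the layer to its left using plain Python lists; w_left_to_final[i][j] plays the role of w_{l_i \rightarrow r_j}, and all names and sample values are assumptions of this example.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_deltas(o_left, o_final, desired, w_left_to_final):
    """Deltas for the final layer and for one layer of left-side neurons.

    o_left[i]             -- output of the ith left-side neuron
    o_final[k]            -- output of the kth final-layer neuron
    desired[k]            -- desired output d_k for the kth final-layer neuron
    w_left_to_final[i][j] -- weight w_{l_i -> r_j} from left neuron i to final neuron j
    """
    # Final layer: delta_{f_k} = o_{f_k} (1 - o_{f_k}) (d_k - o_{f_k})
    delta_final = [o * (1 - o) * (d - o) for o, d in zip(o_final, desired)]

    # Other layers: delta_{l_i} = o_{l_i} (1 - o_{l_i}) * sum_j w_{l_i->r_j} delta_{r_j}
    delta_left = [
        o_left[i] * (1 - o_left[i])
        * sum(w_left_to_final[i][j] * delta_final[j] for j in range(len(delta_final)))
        for i in range(len(o_left))
    ]
    return delta_left, delta_final

# Tiny example: two left-side neurons feeding two final-layer neurons.
o_left = [0.6, 0.3]
w_left_to_final = [[0.5, -0.2],   # weights leaving left neuron 0
                   [0.1,  0.4]]   # weights leaving left neuron 1
o_final = [sigmoid(o_left[0] * w_left_to_final[0][j] + o_left[1] * w_left_to_final[1][j])
           for j in range(2)]
desired = [1.0, 0.0]
print(layer_deltas(o_left, o_final, desired, w_left_to_final))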
Summary

Once you understand how to derive the formulas, you can combine and simplify them in preparation for solving problems.
For each weight, you compute the weight's change from the input corresponding to the weight and from the δ associated with the neuron. Assuming that δ is the delta associated with that neuron, you have the following, where w_{\rightarrow r_j} is the weight connecting the output of the neuron you are working on, the ith left-side neuron, to the jth right-side neuron, and \delta_{r_j} is the δ associated with that right-side neuron.
\Delta w = \alpha \times i \times \delta
\delta = o(1 - o) \times (d - o)    for the final layer
\delta_{l_i} = o_{l_i}(1 - o_{l_i}) \times \sum_j w_{l_i \rightarrow r_j} \times \delta_{r_j}    otherwise

That is, you compute the change in a neuron's w, in every layer, by multiplying α times the neuron's input times its δ.
The δ is determined, for all but the final layer, in terms of the neuron's output and all the weights that connect that output to neurons in the layer to the right and the δs associated with those right-side neurons. The δ for each neuron in the final layer is determined only by the output of that neuron and by the difference between the desired output and the actual output of that neuron.
[Figure: weights and deltas in the layer to the right. A neuron with the weight to be adjusted, input i, and output o feeds N right-side neurons through weights w_{\rightarrow r_1} ... w_{\rightarrow r_N}; those right-side neurons have deltas \delta_1 ... \delta_N.]

MIT OpenCourseWare
http://ocw.mit.edu

6.034 Artificial Intelligence
Fall 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms