6.034f Neural Net Notes
October 28, 2010

These notes are a supplement to material presented in lecture.
I lay out the mathematics more prettily, and extend the analysis to handle multiple neurons per layer. Also, I develop the backpropagation rule, which is often needed on quizzes. I use a notation that I think improves on previous explanations. The reason is that the notation here plainly associates each input, output, and weight with a readily identified neuron: a left-side one and a right-side one. When you arrive at the update formulas, you will have less trouble relating the variables in the formulas to the variables in a diagram. On the other hand, seeing yet another notation may confuse you, so if you already feel comfortable with a set of update formulas, you will not gain by reading these notes.
The sigmoid function

The sigmoid function, $y = 1/(1 + e^{-x})$, is used instead of a step function in artificial neural nets because the sigmoid is continuous, whereas a step function is not, and you need continuity whenever you want to use gradient ascent. Also, the sigmoid function has several desirable qualities. For example, the sigmoid function's value, $y$, approaches 1 as $x$ becomes highly positive; approaches 0 as $x$ becomes highly negative; and equals 1/2 when $x = 0$. Better yet, the sigmoid function features a remarkably simple derivative of the output, $y$, with respect to the input, $x$:

\[
\begin{aligned}
\frac{dy}{dx} &= \frac{d}{dx}\left(\frac{1}{1+e^{-x}}\right) \\
&= \frac{d}{dx}\,(1+e^{-x})^{-1} \\
&= -1 \times (1+e^{-x})^{-2} \times e^{-x} \times -1 \\
&= \frac{1}{1+e^{-x}} \times \frac{e^{-x}}{1+e^{-x}} \\
&= \frac{1}{1+e^{-x}} \times \frac{(1+e^{-x}) - 1}{1+e^{-x}} \\
&= \frac{1}{1+e^{-x}} \times \left(\frac{1+e^{-x}}{1+e^{-x}} - \frac{1}{1+e^{-x}}\right) \\
&= y(1-y)
\end{aligned}
\]

Thus, remarkably, the derivative of the output with respect to the input is expressed as a simple function of the output.
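The identity is easy to check numerically. Here is a minimal Python sketch (the sample point x = 0.7 is arbitrary) comparing the analytic derivative, y(1 - y), against a central-difference estimate:

    import math

    def sigmoid(x):
        # y = 1 / (1 + e^(-x))
        return 1.0 / (1.0 + math.exp(-x))

    x = 0.7
    y = sigmoid(x)
    analytic = y * (1.0 - y)                                # dy/dx = y(1 - y)
    h = 1e-6
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)   # central difference
    print(analytic, numeric)                                # agree to ~10 digits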
The performance function

The standard performance function for gauging how well a neural net is doing is given by the following:

\[
P = -\frac{1}{2}(d_{\text{sample}} - o_{\text{sample}})^2
\]

where $P$ is the performance function, $d_{\text{sample}}$ is the desired output for some specific sample, and $o_{\text{sample}}$ is the observed output for that sample. From this point forward, assume that $d$ and $o$ are the desired and observed outputs for a specific sample so that we need not drag a subscript around as we work through the algebra. The reason for choosing the given formula for $P$ is that the formula has convenient properties. The formula yields a maximum at $o = d$ and monotonically decreases as $o$ deviates from $d$.
Moreover, the derivative of $P$ with respect to $o$ is simple:

\[
\begin{aligned}
\frac{dP}{do} &= \frac{d}{do}\left[-\frac{1}{2}(d - o)^2\right] \\
&= -\frac{1}{2} \times 2 \times (d - o) \times -1 \\
&= d - o
\end{aligned}
\]
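Again, a quick numerical check is possible; this Python sketch uses arbitrary values for d and o:

    d, o = 1.0, 0.3                           # arbitrary example values
    P = lambda o: -0.5 * (d - o) ** 2         # performance function
    h = 1e-6
    numeric = (P(o + h) - P(o - h)) / (2 * h)
    print(d - o, numeric)                     # both print 0.7 (to rounding)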
Gradient ascent

Backpropagation is a specialization of the idea of gradient ascent. You are trying to find the maximum of a performance function $P$, by changing the weights associated with neurons, so you move in the direction of the gradient in a space that gives $P$ as a function of the weights, $w$. That is, you move in the direction of most rapid ascent if we take a step in the direction with components governed by the following formula, which shows how much to change a weight, $w$, in terms of a partial derivative:

\[
\Delta w \propto \frac{\partial P}{\partial w}
\]

The actual change is influenced by a rate constant, $\alpha$; accordingly, the new weight, $w'$, is given by the following:

\[
w' = w + \alpha \times \frac{\partial P}{\partial w}
\]

Gradient descent

If the performance function were $\frac{1}{2}(d_{\text{sample}} - o_{\text{sample}})^2$ instead of $-\frac{1}{2}(d_{\text{sample}} - o_{\text{sample}})^2$, then you would be searching for the minimum rather than the maximum of $P$, and the change in $w$ would be subtracted from $w$ instead of added, so $w'$ would be $w - \alpha \times \frac{\partial P}{\partial w}$ instead of $w + \alpha \times \frac{\partial P}{\partial w}$.
The two sign changes, one in the performance function and the other in the update formula, cancel, so in the end, you get the same result whether you use gradient ascent, as I prefer, or gradient descent.
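The cancellation is easy to see in code. In this Python sketch, the observed output is taken to be $o = w \times i$ for a single linear weight, just to expose the sign behavior; the rate constant and sample values are invented:

    alpha, w, i, d = 0.5, 0.2, 1.0, 1.0   # arbitrary example values
    o = w * i

    # Ascent on P = -1/2 (d - o)^2:  dP/dw = (d - o) * i
    w_up = w + alpha * ((d - o) * i)

    # Descent on P' = +1/2 (d - o)^2:  dP'/dw = -(d - o) * i
    w_down = w - alpha * (-(d - o) * i)

    print(w_up, w_down)   # identical: the two sign changes cancel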
The simplest neural net

Consider the simplest possible neural net: one input, one output, and two neurons, the left neuron and the right neuron. A net with two neurons is the smallest that illustrates how the derivatives can be computed layer by layer.
[Figure: the simplest net. In the left neuron, input $i_l$ is multiplied by weight $w_l$ to form product $p_l$, which passes through a sigmoid to produce output $o_l$; the right neuron repeats the pattern with $i_r$, $w_r$, $p_r$, and $o_r$.]

Note that the subscripts indicate layer.
Thus, $i_l$, $w_l$, $p_l$, and $o_l$ are the input, weight, product, and output associated with the neuron on the left, while $i_r$, $w_r$, $p_r$, and $o_r$ are the input, weight, product, and output associated with the neuron on the right. Of course, $o_l = i_r$. Suppose that the output of the right neuron, $o_r$, is the value that determines performance $P$. To compute the partial derivative of $P$ with respect to the weight in the right neuron, $w_r$, you need the chain rule, which allows you to compute partial derivatives of one variable with respect to another in terms of an intermediate variable.
In particular, for $w_r$, you have the following, taking $o_r$ to be the intermediate variable:

\[
\frac{\partial P}{\partial w_r} = \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial w_r}
\]

Now, you can repeat, using the chain rule to turn $\frac{\partial o_r}{\partial w_r}$ into $\frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_r}$:

\[
\frac{\partial P}{\partial w_r} = \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_r}
\]

Conveniently, you have seen two of the derivatives already, and the third, $\frac{\partial p_r}{\partial w_r} = \frac{\partial (w_r \times o_l)}{\partial w_r} = o_l = i_r$, is easy to compute:

\[
\frac{\partial P}{\partial w_r} = [(d - o_r)] \times [o_r(1 - o_r)] \times [i_r]
\]

Repeating the analysis for $w_l$ yields the following.
Each line is the same as the previous one, except that one more partial derivative is expanded using the chain rule:

\[
\begin{aligned}
\frac{\partial P}{\partial w_l} &= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial w_l} \\
&= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial w_l} \\
&= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial o_l} \times \frac{\partial o_l}{\partial w_l} \\
&= \frac{\partial P}{\partial o_r} \times \frac{\partial o_r}{\partial p_r} \times \frac{\partial p_r}{\partial o_l} \times \frac{\partial o_l}{\partial p_l} \times \frac{\partial p_l}{\partial w_l} \\
&= [(d - o_r)] \times [o_r(1 - o_r)] \times [w_r] \times [o_l(1 - o_l)] \times [i_l]
\end{aligned}
\]

Thus, the derivative consists of products of terms that have already been computed and terms in the vicinity of $w_l$.
This is clearer if you write the two derivatives next to one another:

\[
\begin{aligned}
\frac{\partial P}{\partial w_r} &= (d - o_r) \times o_r(1 - o_r) \times i_r \\
\frac{\partial P}{\partial w_l} &= (d - o_r) \times o_r(1 - o_r) \times w_r \times o_l(1 - o_l) \times i_l
\end{aligned}
\]

You can simplify the equations by defining δs as follows, where each delta is associated with either the left or right neuron:

\[
\begin{aligned}
\delta_r &= o_r(1 - o_r) \times (d - o_r) \\
\delta_l &= o_l(1 - o_l) \times w_r \times \delta_r
\end{aligned}
\]

Then, you can write the partial derivatives with the δs:

\[
\begin{aligned}
\frac{\partial P}{\partial w_r} &= i_r \times \delta_r \\
\frac{\partial P}{\partial w_l} &= i_l \times \delta_l
\end{aligned}
\]
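Here is the two-neuron net in code, a Python sketch in which the values of $i_l$, $w_l$, $w_r$, and $d$ are invented; it computes both partial derivatives via the δs and confirms them against finite differences of $P$:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def forward(w_l, w_r, i_l):
        o_l = sigmoid(w_l * i_l)      # left neuron: p_l = w_l * i_l
        o_r = sigmoid(w_r * o_l)      # right neuron: i_r = o_l
        return o_l, o_r

    i_l, w_l, w_r, d = 0.8, 0.5, -0.3, 1.0    # arbitrary example values
    o_l, o_r = forward(w_l, w_r, i_l)

    delta_r = o_r * (1 - o_r) * (d - o_r)     # deltas as defined above
    delta_l = o_l * (1 - o_l) * w_r * delta_r

    dP_dwr = o_l * delta_r                    # i_r = o_l
    dP_dwl = i_l * delta_l

    def P(w_l, w_r):                          # performance for this net
        return -0.5 * (d - forward(w_l, w_r, i_l)[1]) ** 2

    h = 1e-6
    print(dP_dwr, (P(w_l, w_r + h) - P(w_l, w_r - h)) / (2 * h))
    print(dP_dwl, (P(w_l + h, w_r) - P(w_l - h, w_r)) / (2 * h))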
If you add more layers to the front of the network, each weight has a partial derivative that is computed like the partial derivative of the weight of the left neuron. That is, each has a partial derivative determined by its input and its delta, where its delta in turn is determined by its output, the weight to its right, and the delta to its right.
Thus, for the weights in the final layer, you compute the change as follows, where I use $f$ as the subscript instead of $r$ to emphasize that the computation is for the neuron in the final layer:

\[
\Delta w_f = \alpha \times i_f \times \delta_f \quad \text{where} \quad \delta_f = o_f(1 - o_f) \times (d - o_f)
\]

For all other layers, you compute the change as follows:

\[
\Delta w_l = \alpha \times i_l \times \delta_l \quad \text{where} \quad \delta_l = o_l(1 - o_l) \times w_r \times \delta_r
\]
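For a chain of single-neuron layers, the deltas therefore flow right to left. The following Python sketch, with invented weights, input, and rate constant, performs one full update pass:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    alpha = 0.5
    weights = [0.4, -0.6, 0.9]      # one weight per layer, left to right
    x, d = 1.0, 1.0                 # input and desired output (arbitrary)

    # Forward pass: record each neuron's input and output.
    inputs, outputs = [], []
    o = x
    for w in weights:
        inputs.append(o)
        o = sigmoid(w * o)
        outputs.append(o)

    # Backward pass: delta_f for the final layer, then fold each
    # delta leftward through the weight to its right.
    delta = outputs[-1] * (1 - outputs[-1]) * (d - outputs[-1])
    deltas = [delta]
    for w_right, o in zip(reversed(weights[1:]), reversed(outputs[:-1])):
        delta = o * (1 - o) * w_right * delta
        deltas.append(delta)
    deltas.reverse()                # now aligned left to right

    # Update every weight: delta_w = alpha * input * delta.
    weights = [w + alpha * i * dl for w, i, dl in zip(weights, inputs, deltas)]
    print(weights)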
More neurons per layer

Of course, you really want backpropagation formulas for not only any number of layers but also for any number of neurons per layer, each of which can have multiple inputs, each with its own weight. Accordingly, you need to generalize in another direction, allowing multiple neurons in each layer and multiple weights attached to each neuron. The generalization is an adventure in summations, with lots of subscripts to keep straight, but in the end, the result matches intuition. For the final layer, there may be many neurons, so the formulas need an index, $k$, indicating which final-layer neuron is in play.
For any weight contained in the final-layer neuron, $f_k$, you compute the change as follows from the input corresponding to the weight and from the δ associated with the neuron:

\[
\Delta w = \alpha \times i \times \delta_{f_k} \quad \text{where} \quad \delta_{f_k} = o_{f_k}(1 - o_{f_k}) \times (d_k - o_{f_k})
\]

Note that the output of each final-layer neuron is subtracted from the output desired for that neuron.
For other layers, there may also be many neurons, and the output of each may influence all the neurons in the next layer to the right. The change in weight has to account for what happens to all of those neurons to the right, so a summation appears, but otherwise you compute the change, as before, from the input corresponding to the weight and from the δ associated with the neuron:

\[
\Delta w = \alpha \times i \times \delta_{l_i} \quad \text{where} \quad \delta_{l_i} = o_{l_i}(1 - o_{l_i}) \times \sum_j w_{l_i \to r_j} \times \delta_{r_j}
\]

Note that $w_{l_i \to r_j}$ is the weight that connects the $j$th right-side neuron to the output of the $i$th left-side neuron.
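In code, the summation over right-side neurons is just an inner loop. This Python sketch covers one hidden layer feeding one final layer; the layer sizes and all numeric values are arbitrary, and w[i][j] plays the role of $w_{l_i \to r_j}$:

    # Outputs of two left-side (hidden) neurons, arbitrary example values:
    o_l = [0.6, 0.3]

    # w[i][j] connects the output of the i-th left-side neuron
    # to the j-th right-side neuron.
    w = [[0.2, -0.5],
         [0.7,  0.1]]

    # Right-side (final-layer) outputs and desired outputs, also arbitrary:
    o_f = [0.55, 0.40]
    d   = [1.0, 0.0]

    # Final-layer deltas: delta_fk = o_fk * (1 - o_fk) * (d_k - o_fk)
    delta_f = [o * (1 - o) * (dk - o) for o, dk in zip(o_f, d)]

    # Hidden-layer deltas: delta_li = o_li * (1 - o_li) * sum_j w[i][j] * delta_f[j]
    delta_l = [oi * (1 - oi) * sum(wij * dj for wij, dj in zip(w[i], delta_f))
               for i, oi in enumerate(o_l)]

    print(delta_f, delta_l)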
Summary

Once you understand how to derive the formulas, you can combine and simplify them in preparation for solving problems.
For each weight, you compute the weight's change from the input corresponding to the weight and from the δ associated with the neuron. Assuming that δ is the delta associated with that neuron, you have the following, where $w_{l_i \to r_j}$ is the weight connecting the output of the neuron you are working on, the $i$th left-side neuron, to the $j$th right-side neuron, and $\delta_{r_j}$ is the δ associated with that right-side neuron:

\[
\begin{aligned}
\Delta w &= \alpha \times i \times \delta \\
\delta &= o(1 - o) \times (d - o) \quad \text{for the final layer} \\
\delta_{l_i} &= o_{l_i}(1 - o_{l_i}) \times \sum_j w_{l_i \to r_j} \times \delta_{r_j} \quad \text{otherwise}
\end{aligned}
\]

That is, you compute the change in a neuron's $w$, in every layer, by multiplying $\alpha$ times the neuron's input times its δ.
The δ is determined, for all but the final layer, in terms of the neuron's output and all the weights that connect that output to neurons in the layer to the right, and the δs associated with those right-side neurons. The δ for each neuron in the final layer is determined only by the output of that neuron and by the difference between the desired output and the actual output of that neuron.
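Putting the summary together, here is one complete backpropagation step for a tiny fully-connected net, a Python sketch in which the layer sizes, weights, rate constant, and sample are all invented and, as in these notes, biases are omitted:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    alpha = 0.5
    # One weight matrix per layer; W[i][j] connects input i of the layer
    # to neuron j.  Arbitrary sizes: 2 inputs, 2 hidden, 2 final neurons.
    layers = [
        [[0.1, -0.4], [0.8, 0.2]],    # hidden layer
        [[0.3,  0.6], [-0.7, 0.5]],   # final layer
    ]
    x = [1.0, 0.5]                    # sample input
    d = [1.0, 0.0]                    # desired outputs

    # Forward pass: keep each layer's input vector and output vector.
    ins, outs = [], []
    o = x
    for W in layers:
        ins.append(o)
        o = [sigmoid(sum(oi * W[i][j] for i, oi in enumerate(o)))
             for j in range(len(W[0]))]
        outs.append(o)

    # Backward pass: final-layer deltas first, then fold leftward
    # with the summation over right-side neurons.
    deltas = [[ok * (1 - ok) * (dk - ok) for ok, dk in zip(outs[-1], d)]]
    for k in range(len(layers) - 2, -1, -1):
        W_r, d_r = layers[k + 1], deltas[0]
        deltas.insert(0, [outs[k][i] * (1 - outs[k][i]) *
                          sum(W_r[i][j] * d_r[j] for j in range(len(d_r)))
                          for i in range(len(outs[k]))])

    # Update: delta_w = alpha * input * delta, applied to every weight.
    for W, inp, dl in zip(layers, ins, deltas):
        for i in range(len(W)):
            for j in range(len(W[i])):
                W[i][j] += alpha * inp[i] * dl[j]

    print(layers)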
[Figure: weights and deltas in the layer to the right. The neuron with the weight to be adjusted, with input $i$, weight $w$, and output $o$, feeds $N$ right-side neurons through weights $w_{\to r_1} \ldots w_{\to r_N}$; those right-side neurons carry deltas $\delta_1 \ldots \delta_N$.]

MIT OpenCourseWare
http://ocw.mit.edu

6.034 Artificial Intelligence, Fall 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
