ICIC Express Letters, ICIC International ©2010, ISSN 1881-803X
Volume 4, Number 5, October 2010, pp. 1-9

An Unconstrained Optimization Method Based on BP Neural Network

Fulin Wang*, Huixia Zhu, Jiquan Wang
College of Engineering, Northeast Agricultural University, Harbin 150030, China
Email: fulinwang@yahoo.com.cn

ABSTRACT.
An unconstrained optimization method based on back propagation (BP) neural network is proposed in this paper. The optimization method is mainly applied to solving black box problems. Once the BP neural network has been fitted, this method can adjust the network's input values so as to determine the maximum and minimum output values. With this method, the application of BP neural network is therefore expanded by combining the network's fitting and optimization together. In addition, the research provides a new method to solve issues related to black box optimization, and a new platform is set up for further study of constrained optimization problems based on BP neural network. In this research, a general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output values. Based on this model, the fundamental idea of the unconstrained optimization method is given, the partial derivative equations of the BP neural network output with respect to the input are derived, and an algorithm based on the unconstrained optimization method of BP neural network is proposed. The model is validated by a demonstration calculation, and the results show that the algorithm is an effective method.
Keywords: BP neural network; unconstrained; model; optimization method

1. Introduction.
Back propagation (BP) neural network is one of the important research fields in intelligent control and intelligent automation [1,2]. A BP neural network is composed of many simple parallel algorithmic modules, which are similar to the neurons of biological neural systems. A neural network is a nonlinear dynamic system characterized by distributed information storage and parallel synergistic processing [3]. The structure of a single neuron is simple and its function is limited; however, a network system containing a large number of neurons has various functions and can be used in many applications. The BP neural network model, one of the most important artificial neural network models, is a multi-layer feed-forward neural network and is the most widely studied and used at present. Theory has proved that a three-layer BP neural network with enough hidden-layer nodes can approximate any complex nonlinear mapping [4-7]. This indicates that BP neural networks fit rather easily.

The optimization research on BP neural networks is recorded in a number of publications. Much work has been done on learning rate, weights, thresholds, network structure optimization, and so on, in order to solve problems of the BP neural network such as fluctuation, oscillation, slow fitting speed, and unbounded network structure [8-15]. In practical situations, however, people are concerned not only with the fitting performance of the neural network, but also with how to achieve the maximum and minimum output values by adjusting the input values. The present literature on BP neural network optimization mainly deals with identifying the corresponding relation between network input and output in order to obtain the required output values [16-19]. Actually, these studies should be considered simulation rather than optimization, because the selected optimum program is based on simulation results. Therefore, the starting point of this paper is to probe an optimization method based on the true BP neural network.

An unconstrained optimization method is presented in this research to solve the black box problem: based on a BP neural network, experimental and observed data are used without knowing the functional relation between input and output, and the output is optimized by adjusting the input values. Once the BP neural network has been fitted, the method adjusts the network's input values so as to determine the maximum and minimum output values. The fitting and optimization of the BP neural network are thus combined. This combination expands the application domain of BP neural networks, solves issues related to black box optimization, and sets up a new platform for further study of constrained optimization problems based on BP neural networks; such constrained problems will be discussed in further studies.

This paper is divided into five parts. The first part discusses the aim and significance of the research and the status quo, based on the BP neural network's adaptive characteristics. The second part mainly describes the BP neural network's structure and its algorithm. The unconstrained optimization method based on BP neural network is given in the third part. An illustration of the research is given in the fourth part. Finally, the achievements of the research are summarized.
2. BP neural network structure and its algorithm.
2.1. BP neural network structure. The BP neural network is a multi-layer feed-forward network; the three-layer network structure is usually used [20]. The network structure is shown in Figure 1.

[Figure 1. Three-layer BP network structure]

2.2. BP neural network algorithm.
(1) Forward propagation process. The input signal, starting from the input layer, passes through the hidden-layer units and is transmitted to the output layer, where the output signal is generated. If the output signals meet the given output requirement, the calculation is terminated; if they do not, the process turns to error signal back-propagation.
The forward propagation process is calculated as follows. Suppose the input layer has q+1 input signals, any of which is indexed by i; the hidden layer has p+1 neurons, any of which is indexed by j; and the output layer has o output neurons, any of which is indexed by k. The weights between the input layer and the hidden layer are denoted $v_{ij}$ (i = 0, 1, 2, ..., q; j = 1, 2, ..., p), where $v_{0j}$ is the hidden-layer threshold. The weights between the hidden layer and the output layer are denoted $u_{jk}$ (j = 0, 1, 2, ..., p; k = 1, 2, ..., o), where $u_{0k}$ is the output-layer threshold. Let the hidden-layer input be $net_j$ (j = 1, 2, ..., p), the hidden-layer output be $y_j$ (j = 1, 2, ..., p), the output-layer input be $net_k$ (k = 1, 2, ..., o), and the output-layer output be $z_k$ (k = 1, 2, ..., o). Suppose the training sample set is $X = [X_1, X_2, \ldots, X_r, \ldots, X_n]$, where any training sample is $X_r = [x_{r0}, x_{r1}, x_{r2}, \ldots, x_{rq}]$ (r = 1, 2, ..., n; $x_{r0} = -1$). The actual output and the desired output are $z_r = [z_{r1}, z_{r2}, \ldots, z_{ro}]^T$ and $d_r = [d_{r1}, d_{r2}, \ldots, d_{ro}]^T$ respectively. If m denotes the iteration number, then the weights and the actual output are functions of m. For the signal forward propagation process, taking $X_r$ as the network input training sample and writing f for the transfer function of equation (12) (with $y_0 = -1$ carrying the output-layer threshold), we have

$net_j = \sum_{i=0}^{q} v_{ij} x_{ri}, \quad j = 1, 2, \ldots, p$  (1)

$y_j = f(net_j), \quad j = 1, 2, \ldots, p$  (2)

$net_k = \sum_{j=0}^{p} u_{jk} y_j, \quad k = 1, 2, \ldots, o$  (3)

$z_k = f(net_k), \quad k = 1, 2, \ldots, o$  (4)

For the above equations, the k-th output-layer neuron's error signal is $e_k = d_{rk} - z_{rk}$, and the k-th neuron's error energy is defined as $\frac{1}{2} e_k^2$. The sum of the error energy over all neurons of the output layer is

$E = \frac{1}{2} \sum_{k=1}^{o} (d_{rk} - z_{rk})^2$  (5)

If $E \le e$ (where e denotes the expected calculation accuracy), the calculation is finished; otherwise, back-propagation computing is carried out.
(2) Error back-propagation process. The error signal is the margin between the actual network output and the desired output. It starts at the output layer and propagates backward layer by layer; this is the back-propagation of the error signal. In the error back-propagation process, the network weights are adjusted by error feedback; through repeated modification of the weights, the actual network output gradually approaches the desired output. For the unipolar sigmoid transfer function, the error back-propagation process is calculated by the standard delta rule as follows:

$\delta_k = (d_{rk} - z_{rk}) z_{rk} (1 - z_{rk}), \quad k = 1, 2, \ldots, o$  (6)

$u_{jk}(m+1) = u_{jk}(m) + \eta \delta_k y_j, \quad j = 0, 1, \ldots, p$  (7)

$\delta_j = y_j (1 - y_j) \sum_{k=1}^{o} \delta_k u_{jk}, \quad j = 1, 2, \ldots, p$  (8)

$v_{ij}(m+1) = v_{ij}(m) + \eta \delta_j x_{ri}, \quad i = 0, 1, \ldots, q$  (9)

where $\eta$ is the learning rate, a given constant. After calculation of the new weights of the various layers, the calculation returns to the forward propagation process.
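The weight update can be sketched directly from equations (6)-(9) under the same layout assumption; `eta` stands for the learning rate η, and the function performs one online update for a single training sample.

```python
def backprop_step(x, d, V, U, eta=0.5):
    # One error back-propagation update for sample (x, d), equations (6)-(9).
    y, z = forward(x, V, U)
    delta_k = (d - z) * z * (1.0 - z)               # output error signal, eq. (6)
    delta_j = y * (1.0 - y) * (U[1:, :] @ delta_k)  # hidden error signal, eq. (8)
    U += eta * np.outer(np.concatenate(([-1.0], y)), delta_k)  # eq. (7)
    V += eta * np.outer(np.concatenate(([-1.0], x)), delta_j)  # eq. (9)
    return 0.5 * np.sum((d - z) ** 2)               # error energy E, eq. (5)
```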
3. The unconstrained optimization method based on BP neural network.
3.1. Mathematical model. For convenience, maximizing the network output is used as the example to illustrate the problem in this paper. Letting F(X) denote the relation between input and output, a mathematical model of unconstrained optimization based on BP neural network can be expressed as

$\max Z = F(X)$  (10)

where X is the input vector, $X = (x_1, x_2, \ldots, x_q)^T$, and Z is the output of the BP neural network.
3.2. Basic idea. The gradient of the output at the initial point $X^{(0)}$, which is selected artificially or at random, is calculated first. If the gradient at $X^{(0)}$ is not 0, it must be possible to find a new point $X^{(1)}$ that is better than $X^{(0)}$ in the direction of the gradient at $X^{(0)}$, and the gradient at $X^{(1)}$ can then be computed. If the gradient at $X^{(1)}$ is not 0, a new point $X^{(2)}$ that is better than $X^{(1)}$ in the direction of the gradient at $X^{(1)}$ is calculated. This process continues until the gradient 0 is obtained or a better point cannot be found (that is, until the product of the gradient and the step size is less than or equal to the computer's threshold value). At that point, the X value is the optimal input and the corresponding network output is the optimal output.
3.3. The partial derivatives of the network output with respect to the input. The gradient vector of the function F(X) is

$\nabla F(X) = \left( \frac{\partial F}{\partial x_1}, \frac{\partial F}{\partial x_2}, \ldots, \frac{\partial F}{\partial x_q} \right)^T$  (11)

Thus, as long as the partial derivatives of F(X) are calculated, the gradient of F(X) can be obtained. The following is the procedure to derive the partial derivatives of the BP neural network output with respect to its input. The BP neural network transfer function is generally the unipolar sigmoid function

$f(x) = \frac{1}{1 + e^{-x}}$  (12)

Taking this transfer function as the example, the partial derivatives of the network output with respect to the input are derived as follows. Since the derivative of f(x) is

$f'(x) = f(x)(1 - f(x))$  (13)

applying the chain rule through equations (1)-(4) gives, for k = 1, 2, ..., o; i = 1, 2, ..., q; j = 1, 2, ..., p:

$\frac{\partial z_k}{\partial net_k} = z_k (1 - z_k)$  (14)

$\frac{\partial net_k}{\partial y_j} = u_{jk}$  (15)

$\frac{\partial y_j}{\partial net_j} = y_j (1 - y_j)$  (16)

$\frac{\partial net_j}{\partial x_i} = v_{ij}$  (17)

$\frac{\partial z_k}{\partial x_i} = \frac{\partial z_k}{\partial net_k} \sum_{j=1}^{p} \frac{\partial net_k}{\partial y_j} \frac{\partial y_j}{\partial net_j} \frac{\partial net_j}{\partial x_i}$  (18)

So

$\frac{\partial z_k}{\partial x_i} = z_k (1 - z_k) \sum_{j=1}^{p} u_{jk} y_j (1 - y_j) v_{ij}$  (19)

If we let

$g_k = z_k (1 - z_k)$  (20)

$h_j = y_j (1 - y_j)$  (21)

then we have

$\frac{\partial z_k}{\partial x_i} = g_k \sum_{j=1}^{p} u_{jk} h_j v_{ij}$  (22)
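Equation (22) translates directly into code. The sketch below reuses `forward` from above and returns the gradient of a chosen output $z_k$ with respect to the whole input vector; the intermediate names follow $g_k$ and $h_j$ of equations (20) and (21).

```python
def grad_output_wrt_input(x, V, U, k=0):
    # Partial derivatives of output z_k w.r.t. the inputs, equations (19)-(22).
    y, z = forward(x, V, U)
    g_k = z[k] * (1.0 - z[k])                  # equation (20)
    h = y * (1.0 - y)                          # equation (21)
    # V[1:, :] drops the threshold row v_0j; U[1:, k] drops u_0k.
    return g_k * (V[1:, :] @ (U[1:, k] * h))   # equation (22)
```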
3.4. The unconstrained optimization method. Let $X^{(0)}$ be the artificially or randomly selected initial point, and let $X^{(m)}$ be the point obtained at the m-th iteration. The gradient at $X^{(m)}$ is then calculated from equation (22) as

$\nabla F(X^{(m)}) = \left( \frac{\partial Z}{\partial x_1}, \ldots, \frac{\partial Z}{\partial x_q} \right)^T \Big|_{X = X^{(m)}}$  (23)

If $X^{(m)}$ meets the iteration termination condition

$\| \alpha \nabla F(X^{(m)}) \| \le \varepsilon$  (24)

where $\alpha$ is the step factor ($\alpha > 0$) and $\varepsilon$ is the pre-specified precision value, then the optimal solution is

$X^* = X^{(m)}$  (25)

and the corresponding $Z^*$ is the optimal value. If equation (24) is not satisfied, then let

$X^{(m+1)} = X^{(m)} + \alpha \nabla F(X^{(m)})$  (26)

$Z^{(m+1)} = F(X^{(m+1)})$  (27)

The adjustment method for $X^{(m+1)}$ is described in the following two cases.

Case 1: $X^{(m+1)}$ is not superior to $X^{(m)}$. Let

$\alpha = \alpha / 2$  (28)

According to equations (26) and (27), a new $X^{(m+1)}$ value is obtained by recalculating; it is then judged whether $X^{(m+1)}$ is superior to $X^{(m)}$, namely whether it satisfies

$F(X^{(m+1)}) > F(X^{(m)})$  (29)

If $X^{(m+1)}$ satisfies equation (29), the next iteration begins. If $X^{(m+1)}$ does not satisfy equation (29), equation (28) is used again to reduce the step size, equations (26) and (27) are used to recalculate a new $X^{(m+1)}$, and equation (29) is checked once more. These steps are repeated until the newly calculated $X^{(m+1)}$ is superior to $X^{(m)}$, and this iteration then comes to an end.

Case 2: $X^{(m+1)}$ is superior to $X^{(m)}$. Let

$\alpha = 2\alpha$  (30)

and obtain a new $X^{(m+1)}$ value by recalculating with equations (26) and (27); then determine whether this $X^{(m+1)}$ is still superior to $X^{(m)}$, i.e., whether it satisfies equation (29). If not, let

$X^{(m+1)} = X^{(m)}$  (31)

the step size is reduced by half at the same time, and this iteration is done. If the new $X^{(m+1)}$ satisfies equation (29), then assign the $X^{(m+1)}$ value to $X^{(m)}$:

$X^{(m)} = X^{(m+1)}$  (32)

At this point, the step size continues to be increased using equation (30) and $X^{(m+1)}$ is recalculated by equations (26) and (27) until an $X^{(m+1)}$ that is not superior to $X^{(m)}$ is obtained; then $X^{(m+1)}$ is replaced by $X^{(m)}$, the step size is reduced by half at the same time, and this iteration is over.

After the completion of each iteration, it must be judged whether the result meets the iteration termination condition (24). If equation (24) is satisfied, then let

$X^* = X^{(m+1)}$  (33)

where $X^*$ is the optimal solution and the corresponding $Z^*$ is the required optimal value. If equation (24) is not satisfied, the next iteration begins; this continues until the $X^{(m)}$ values satisfy equation (24), and the optimization is then over.
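Putting Sections 3.2-3.4 together, the whole procedure is gradient ascent over the network inputs with a doubling/halving step-size rule. The loop below is a simplified sketch of that rule (grow the step after a successful move, halve it after a failed one), which keeps the step-size inheritance between iterations; `alpha0` and `eps` correspond to $\alpha$ and $\varepsilon$.

```python
def optimize_input(x0, V, U, alpha0=0.1, eps=1e-8, max_iter=100000):
    # Unconstrained optimization of the network input (Section 3.4, a sketch).
    x = np.asarray(x0, dtype=float)
    f = forward(x, V, U)[1][0]
    alpha = alpha0
    for _ in range(max_iter):
        g = grad_output_wrt_input(x, V, U)
        if np.linalg.norm(alpha * g) <= eps:   # termination condition (24)
            break
        x_new = x + alpha * g                  # trial point, equation (26)
        f_new = forward(x_new, V, U)[1][0]     # its output, equation (27)
        if f_new > f:                          # superiority test, equation (29)
            x, f = x_new, f_new                # accept the point, equation (32)
            alpha *= 2.0                       # Case 2: grow the step, eq. (30)
        else:
            alpha *= 0.5                       # Case 1: halve the step, eq. (28)
    return x, f                                # X* and Z*, equations (25)/(33)
```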
4. Demonstration calculation. Because the optimal value of a black box problem is unknown, it is difficult to validate the accuracy and stability of the optimization method. To overcome this problem, we select two known functions for discretization and then use the discretized samples to conduct BP neural network fitting training. The optimal value of the network output is obtained after the fitting training, so that the optimized results can be compared with the theoretical optimal values.
4.1. Example 1. Let the known function be given by equation (34):

(34)

The theoretical maximum value of this function is $\max F(X) = F(57.625, 51.136, 1) = 2045.412$. In this example, the BP neural network is used to fit the function and the maximum network output after fitting is sought. The first step is the discretization of the function F(X). Six points with equal intervals are selected for $x_1$, $x_2$, and $x_3$ in the ranges 30 to 80, 25 to 75, and -10 to 15 respectively, giving a total of 216 points. The corresponding values of F(X) are calculated from equation (34), and the BP neural network is then used to fit the function. At this step, the network structure is 3-25-1. When the network reaches the pre-specified precision $e = 10^{-4}$, the network weights and thresholds are kept. Under this condition, the average relative error of the network fitting is 0.002854%.
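For reference, the 216-point sample grid of Example 1 can be generated as follows; `F34` is a placeholder for the known function of equation (34), which the fitting step evaluates at every grid point.

```python
from itertools import product
import numpy as np

# Discretization of Example 1: six equally spaced points per variable,
# 6 * 6 * 6 = 216 sample points in total.
x1s = np.linspace(30, 80, 6)    # 30, 40, ..., 80
x2s = np.linspace(25, 75, 6)    # 25, 35, ..., 75
x3s = np.linspace(-10, 15, 6)   # -10, -5, ..., 15
samples = np.array(list(product(x1s, x2s, x3s)))
# targets = np.array([F34(x1, x2, x3) for x1, x2, x3 in samples])
```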
Second, max F(X) is calculated using the optimization method given in this paper. Table 1 shows the same max F values calculated from ten different initial points at ε = 0. In Table 1, $\bar{F}$ is the average value of the optimized results over the 10 runs, and $\beta_i$ is the stability indicator for measuring the optimized results, defined by equation (35).
(35)

Table 1. Calculation results of F(X)

Num.  X(0) = (x1, x2, x3)    x1       x2       x3      max F      βi
1     (30, 25, -10)          57.697   51.167   1.000   2045.329   1
2     (40, 35, -5)           57.697   51.167   1.000   2045.329   1
3     (50, 45, 0)            57.697   51.167   1.000   2045.329   1
4     (60, 55, 5)            57.697   51.167   1.000   2045.329   1
5     (70, 65, 10)           57.697   51.167   1.000   2045.329   1
6     (80, 75, 15)           57.697   51.167   1.000   2045.329   1
7     (30, 45, 5)            57.697   51.167   1.000   2045.329   1
8     (40, 65, 10)           57.697   51.167   1.000   2045.329   1
9     (50, 75, 0)            57.697   51.167   1.000   2045.329   1
10    (80, 35, 10)           57.697   51.167   1.000   2045.329   1

Average: $\bar{F} = 2045.329$.

4.2. Example 2.
Let the known function be

$F(x) = x^2 - 20x + 205$  (36)

The minimum value of this function is $\min F(x) = F(10) = 105$. The following is the process of fitting the function and seeking the minimum network output after fitting with the BP neural network. First, the minimization of F(x) is turned into a maximization by negating the function:

$\max[-F(x)] = -\min F(x)$  (37)

The function F(x) is then discretized, with 81 points taken in the range 0 to 20 at intervals of 0.25. The corresponding values of -F(x) are calculated according to equation (37), and the BP neural network is used to fit them. At this step, the network structure is 1-10-1. When the network meets the pre-specified precision $e = 10^{-5}$, the network weights and thresholds are kept; at this point, the average relative error of the network fitting is 0.1004%.
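The discretization of Example 2 is small enough to state explicitly; the targets are negated per equation (37) so that seeking the minimum becomes seeking a maximum.

```python
import numpy as np

xs = np.arange(0.0, 20.0001, 0.25)       # 81 points in [0, 20] at step 0.25
F36 = lambda x: x**2 - 20.0*x + 205.0    # equation (36)
targets = -F36(xs)                       # negated according to equation (37)
```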
Then min F(x) is calculated using the optimization method given in this paper. Table 2 shows the results calculated from ten different initial points at ε = 0. In Table 2, $\bar{F}$ is the average value of the optimization results over the 10 runs.
Table 2. Calculation results of F(x)

Num.  X(0)    x       min F     βi
1     0       9.971   105.003   1
2     2.25    9.971   105.003   1
3     4.75    9.971   105.003   1
4     7.25    9.971   105.003   1
5     9.75    9.971   105.003   1
6     12.25   9.971   105.003   1
7     14.75   9.971   105.003   1
8     17.25   9.971   105.003   1
9     19.75   9.971   105.003   1
10    20      9.971   105.003   1

Average: $\bar{F} = 105.003$.

Table 1 and Table 2 show that the results of the optimization method are very stable and that the optimal value of F(x) is very close to the theoretical optimal value.
In Example 1, the average relative errors of $x_1$, $x_2$, and $x_3$ are 0.125%, 0.061%, and 0% respectively. The maximum function value found by the BP neural network optimization is also very close to the theoretical maximum value; the relative error is only 0.00406%. In Example 2, the average relative error of x is 0.29%. The minimum value found by the BP neural network optimization is likewise very close to the theoretical minimum value; the relative error is only 0.00286%. Note that these errors also include the fitting errors. The results indicate that the accuracy of the network optimization is relatively high.
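These relative errors can be checked directly from the tabulated values and the theoretical optima:

```python
rel_err = lambda found, true: abs(found - true) / abs(true) * 100.0

print(rel_err(57.697, 57.625))      # x1, Example 1: ~0.125 %
print(rel_err(51.167, 51.136))      # x2, Example 1: ~0.061 %
print(rel_err(2045.329, 2045.412))  # max F, Example 1: ~0.00406 %
print(rel_err(9.971, 10.0))         # x, Example 2: 0.29 %
print(rel_err(105.003, 105.0))      # min F, Example 2: ~0.00286 %
```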
5. Conclusion.
(1) An unconstrained optimization method based on BP neural network is proposed, which is powerful for solving the black box problem.
(2) Once the BP neural network has been fitted, this method can adjust the network's input values so as to determine the maximum and minimum output values. With this method, the application of BP neural network is therefore expanded by combining the network's fitting and optimization together. In addition, the research provides a new method to solve issues related to black box optimization, and a new platform is set up for further study of constrained optimization problems based on BP neural network.
(3) In this research, a general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output values. Based on this model, the fundamental idea of the unconstrained optimization method based on BP neural network is given, and the partial derivative equations of the BP neural network output with respect to the input are derived.
(4) The step size of the optimization method presented in this paper has inheritance (the step factor carries over between iterations), which accelerates the optimization speed.
(5) The model is validated by a demonstration calculation, and the results show that the algorithm is an effective method.
Acknowledgment. The research is supported by the National Natural Science Foundation of China and the National High Technology Research and Development Program of China (Grant Nos. 31071331 and 2006AA10A310-1).

REFERENCES
[1] YU Wei-ping, PENG Yi-gong. Intelligent control technology research [C]. Eighth Conference on Industrial Instrumentation and Automation Academic Meeting, 2007, 415-418.
[2] LI Shu-rong, YANG Qing, GUO Shu-hui. Neural network based adaptive control for a class of nonaffine nonlinear systems [J]. Systems Science and Mathematics, 2007, 27(2): 161-169.
[3] CHEN Ming-jie, NI Jin-ren, CHA Ke-mai, et al. Application of genetic algorithm-based artificial neural networks in 2D tidal flow simulation [J]. Journal of Hydraulic Engineering, 2003, (10): 1-12.
[4] ZHOU Ling, SUN Jun, YUAN Yu-bo. Effects of combined activation function on BP algorithm's convergence speed [J]. Journal of Hohai University, 1999, 27(5): 107-108.
[5] TANG Wan-mei. The study of the optimal structure of BP neural network [J]. Systems Engineering Theory & Practice, 2005, (10): 95-100.
[6] Funahashi K. On the approximate realization of continuous mappings by neural networks [J]. Neural Networks, 1989, 2(7): 183-192.
[7] Hecht-Nielson R. Theory of the backpropagation neural networks [M]. Washington D.C.: Proceedings of IEEE International Joint Conference on Neural Networks, 1989.
[8] Zhang Y., Wu L. Weights optimization of neural network via improved BCO approach [J]. Progress In Electromagnetics Research, PIER 83, 185-198, 2008.
[9] WANG Wen-jian. The optimization of BP neural networks [J]. Computer Engineering and Design, 2000, 21(6): 8-10.
[10] ZHANG Shan, HE Jiannong. Research on optimized algorithm for BP neural networks [J]. Computer and Modernization, 2009, (1): 73-80.
[11] Xing Hihua, Lin Hngyan, Chen Huandong, et al. Sensitivity analysis of BP neural network optimized by genetic algorithm and its applications to feature reduction [J]. International Review on Computers and Software, 2012, 7(6): 3084-3089.
[12] Chunsheng Dong, Liu Dong, Mingming Yang. The application of the BP neural network in the nonlinear optimization [J]. Advances in Intelligent and Soft Computing, 2010, 78: 727-732.
[13] Shifei Ding, Chunyang Su, Junzhao Yu. An optimizing BP neural network algorithm based on genetic algorithm [J]. Artificial Intelligence Review, 2011, 36(2): 153-162.
[14] Li Song, Liu Lijun, Zhai Man. Prediction for short-term traffic flow based on modified PSO optimized BP neural network [J]. Systems Engineering - Theory & Practice, 2012, 39(9): 2045-2049.
[15] Xing Hihua, Lin Hngyan. An intelligent method optimizing BP neural network model [C]. 2nd International Conference on Materials and Products Manufacturing Technology, ICMPMT 2012, 2012, 2470-2474.
[16] Merad L., Bendimerad F. T., Meriah S. M., et al. Neural networks for synthesis and optimization of antenna arrays [J]. Radioengineering Journal, 2007, 16(1): 23-30.
[17] Gulati T., Chakrabarti M., Singh A., et al. Comparative study of response surface methodology, artificial neural network and genetic algorithms for optimization of soybean hydration [J]. Food Technol Biotechnol, 2010, 1(48): 11-18.
[18] WANG Xin-min, ZHAO Bin, WANG Xian-lai. Optimization of drilling and blasting parameters based on back-propagation neural network [J]. Journal of Central South University (Natural Science), 2009, 40(5): 1411-1416.
[19] LIU Lei. Index tracking optimization method based on genetic neural network [J]. Systems Engineering Theory & Practice, 2010, 30(1): 22-29.
[20] HAN Li-qun. Artificial Neural Network Tutorial [M]. Beijing University of Posts and Telecommunications Press, 2006.
