ICIC Express Letters, ICIC International, ISSN 1881-803X, Volume 4, Number 5, October 2010, pp. 1-9

An Unconstrained Optimization Method Based on BP Neural Network

Fulin Wang*, Huixia Zhu, Jiquan Wang
College of Engineering, Northeast Agricultural University, Harbin 150030, China
Email: fulinwang@yahoo.com.cn

ABSTRACT.
An unconstrained optimization method based on back propagation (BP) neural network is proposed in this paper. The method is mainly applied to black-box problems: once a BP neural network has been fitted, the method adjusts the network's input values so as to locate the maximum or minimum output value. The application of BP neural networks is thereby expanded, because the network's fitting and optimization are combined. In addition, the research provides a new way to solve black-box optimization problems and sets up a platform for further study of constrained optimization based on BP neural networks. A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the unconstrained optimization method is given, the partial derivatives of the BP network's output with respect to its input are derived, and an algorithm implementing the method is proposed. The model is validated by sample calculations, and the results show that the algorithm is effective.

Keywords: BP neural network; unconstrained; model; optimization method

1. Introduction.
Back propagation (BP) neural networks are one of the important research fields in intelligent control and intelligent automation [1,2]. A BP neural network is composed of many simple parallel processing modules, similar to the neurons of a biological nervous system. A neural network is a nonlinear dynamic system characterized by distributed information storage and parallel cooperative processing [3]. The structure of a single neuron is simple and its function is limited; however, a network system containing a large number of neurons has rich functionality and can be used in many applications. The BP neural network model, one of the most important artificial neural network models, is a multi-layer feed-forward network and is the most widely studied and used at present. Theory has proved that a three-layer BP neural network with enough hidden-layer nodes can simulate any complex nonlinear mapping [4-7], which indicates that BP neural networks fit data rather easily.

The optimization of BP neural networks has been studied in the literature. Much work has addressed the learning rate, weights, thresholds, and network structure, in order to overcome problems of BP networks such as fluctuation, oscillation, slow fitting speed, and unbounded network structure [8-15]. In practice, however, people are concerned not only with the fitting quality of the network but also with how to achieve the maximum or minimum output value by adjusting the input values. The existing literature on BP network optimization mainly identifies the corresponding relation between network input and output in order to obtain the required output values [16-19]. Such studies can actually be regarded as simulation rather than optimization, because the selected optimum scheme is chosen from the simulation results. The starting point of this paper is therefore to explore an optimization method based on the BP neural network itself.

An unconstrained optimization method is presented in this research to solve the black-box problem: experimental or observed data are fitted with a BP neural network without knowing the functional relation between input and output, and the output is then optimized by adjusting the input values. Once the BP network has been fitted, the method adjusts the network's input values to locate the maximum or minimum output value; the fitting and optimization of the BP neural network are thus combined. This combination expands the application domain of BP neural networks, solves black-box optimization problems, and sets up a platform for further study of constrained optimization based on BP neural networks; constrained optimization problems will be discussed in further studies.

This paper is divided into five parts. The first part discusses the aim and significance of the research and the state of the art, based on the BP neural network's adaptive characteristics. The second part describes the BP neural network's structure and its algorithm. The unconstrained optimization method based on BP neural network is given in the third part. An illustration of the method is given in the fourth part. Finally, the achievements of the research are summarized.
2. BP neural network structure and its algorithm

2.1. BP neural network structure. A BP neural network is a multi-layer feed-forward network; the three-layer network structure is usually used [20]. The network structure is shown in Figure 1.

Figure 1. Three-layer BP network structure

2.2. BP neural network algorithm.
(1) Forward propagation process. The input signal, starting from the input layer, passes through the hidden-layer units and is transmitted to the output layer, where the output signal is finally generated. If the output signals meet the given output requirement, the calculation is terminated; if they do not, the computation turns to error signal back-propagation.
The forward propagation process is calculated as follows. Suppose the input layer has q+1 input signals, any one of which is indexed by i; the hidden layer has p+1 neurons, indexed by j; and the output layer has o output neurons, indexed by k. The weights between the input layer and the hidden layer are v_ij (i = 0, 1, 2, ..., q; j = 1, 2, ..., p), where v_0j is the hidden-layer threshold; the weights between the hidden layer and the output layer are u_jk (j = 0, 1, 2, ..., p; k = 1, 2, ..., o), where u_0k is the output-layer threshold. Let the hidden-layer input be net_j (j = 1, 2, ..., p), the hidden-layer output y_j (j = 1, 2, ..., p), the output-layer input net_k (k = 1, 2, ..., o), and the output-layer output z_k (k = 1, 2, ..., o). Suppose the training sample set is X = [X_1, X_2, ..., X_r, ..., X_n], where any training sample is X_r = [x_r0, x_r1, x_r2, ..., x_rq] (r = 1, 2, ..., n; x_r0 = -1). The actual output and desired output are z_r = [z_r1, z_r2, ..., z_ro]^T and d_r = [d_r1, d_r2, ..., d_ro]^T respectively. Letting m be the iteration number, the weights and the actual output are functions of m. For the forward propagation of a network input training sample X_r, we have:

net_j = SUM_{i=0}^{q} v_ij x_ri,  j = 1, 2, ..., p   (1)

y_j = f(net_j)   (2)

net_k = SUM_{j=0}^{p} u_jk y_j,  k = 1, 2, ..., o  (with y_0 = -1)   (3)

z_k = f(net_k)   (4)

where f is the transfer function. For the above equations, the error signal of the k-th neuron of the output layer is e_rk = d_rk - z_rk, and the k-th neuron's error energy is defined as (1/2)e_rk^2. The sum of the error energies of all neurons of the output layer is:

E_r = (1/2) SUM_{k=1}^{o} e_rk^2   (5)

If E_r <= e (where e expresses the expected calculation accuracy), the calculation is finished; otherwise, the back-propagation computation is carried out.
(2) Error back-propagation process. The error signal is the margin between the actual network output and the desired output. Starting from the output layer, the error signal propagates backward layer by layer, which is the back-propagation of the error signal. During this process, the network weights are adjusted by the error feedback; through repeated modification of the weights, the actual network output gradually approaches the desired output. With the sigmoid transfer function, the error back-propagation process is calculated as follows:

delta_k = e_k z_k (1 - z_k)   (6)

u_jk(m+1) = u_jk(m) + eta * delta_k * y_j   (7)

delta_j = y_j (1 - y_j) SUM_{k=1}^{o} delta_k u_jk   (8)

v_ij(m+1) = v_ij(m) + eta * delta_j * x_i   (9)

where eta is the learning rate, a given constant. After the new weights of the various layers have been calculated, the computation returns to the forward propagation process.
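A single weight-update step matching equations (6)-(9) might look like the following sketch, for one training sample; the learning rate value, weight shapes, and helper names are illustrative assumptions, not the paper's trained configuration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, d, V, U, eta=0.5):
    """One forward + backward pass for a single sample (x, d).
    V: (q+1, p) weights with bias row 0; U: (p+1, o) likewise."""
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)                 # hidden outputs, eqs (1)-(2)
    y_aug = np.concatenate(([-1.0], y))
    z = sigmoid(y_aug @ U)                 # network outputs, eqs (3)-(4)

    e = d - z                              # error signal
    delta_k = e * z * (1.0 - z)            # equation (6)
    # hidden delta uses the non-threshold rows of the current U
    delta_j = y * (1.0 - y) * (U[1:] @ delta_k)   # equation (8)
    U += eta * np.outer(y_aug, delta_k)    # equation (7), in place
    V += eta * np.outer(x_aug, delta_j)    # equation (9), in place
    return 0.5 * float(e @ e)              # error energy, equation (5)

rng = np.random.default_rng(1)
V = rng.normal(scale=0.5, size=(3, 4))    # q = 2, p = 4
U = rng.normal(scale=0.5, size=(5, 1))    # o = 1
x, d = np.array([0.2, 0.7]), np.array([0.9])
errors = [train_step(x, d, V, U) for _ in range(200)]
print(errors[-1] < errors[0])   # error energy shrinks over training
```

Repeating the step drives the error energy of equation (5) toward the expected accuracy e, at which point training stops.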
3. The unconstrained optimization method based on BP neural network

3.1. Mathematical model.
Since a minimization problem can always be converted into a maximization problem, the maximum of the network output is used as the example throughout this paper for convenience. Letting F(X) be the relation between input and output, the mathematical model of unconstrained optimization based on a BP neural network can be expressed as follows:

max Z = F(X)   (10)

where X = (x1, x2, ..., xq)^T is the input vector and Z is the output of the BP neural network.

3.2. Basic idea. The gradient of the output at the initial point X(0), which is selected artificially or randomly, is calculated first. If the gradient at X(0) is not 0, a new point X(1) that is better than X(0) can be found in the direction of the gradient at X(0), and the gradient at X(1) can then be obtained. If the gradient at X(1) is not 0, a new point X(2) that is better than X(1) is found in the direction of the gradient at X(1). This process continues until the gradient is 0 or a better point cannot be found (that is, until the product of the gradient and the step size is less than or equal to the threshold value of the computer). At this point, the value of X is the optimal input, and the corresponding network output is the optimal output.
3.3. The partial derivatives of the network output with respect to the input. The gradient vector of the function F(X) is

grad F(X) = (dF/dx1, dF/dx2, ..., dF/dxq)^T   (11)

Thus, as long as the partial derivatives of F(X) are calculated, the gradient of F(X) can be obtained. The following derives the partial derivatives of the BP neural network output with respect to its input. The BP network transfer function is generally the unipolar sigmoid function

f(x) = 1 / (1 + e^(-x))   (12)

Taking this transfer function as the example, and noting that its derivative is

f'(x) = f(x)[1 - f(x)]   (13)

the chain rule applied to equations (1)-(4) gives (k = 1, 2, ..., o; i = 1, 2, ..., q; j = 1, 2, ..., p):

dz_k/dx_i = SUM_{j=1}^{p} (dz_k/dnet_k)(dnet_k/dy_j)(dy_j/dnet_j)(dnet_j/dx_i)   (14)

dz_k/dnet_k = z_k (1 - z_k)   (15)

dnet_k/dy_j = u_jk   (16)

dy_j/dnet_j = y_j (1 - y_j)   (17)

dnet_j/dx_i = v_ij   (18)

So

dz_k/dx_i = z_k (1 - z_k) SUM_{j=1}^{p} u_jk y_j (1 - y_j) v_ij   (19)

If we let

delta_k = z_k (1 - z_k)   (20)

delta_j = y_j (1 - y_j)   (21)

then we have

dz_k/dx_i = delta_k SUM_{j=1}^{p} u_jk delta_j v_ij   (22)

3.4. The unconstrained optimization method.
Let X(0) be the artificially or randomly selected initial point, and let X(m) be the point obtained after m iterations. The gradient at X(m) is calculated by

grad F(X(m)) = (dZ/dx1, dZ/dx2, ..., dZ/dxq)^T evaluated at X = X(m)   (23)

If X(m) meets the iteration termination condition

|| alpha(m) grad F(X(m)) || <= epsilon   (24)

where alpha(m) > 0 is the step factor and epsilon is the pre-specified precision value, then the optimal solution is

X* = X(m)   (25)

and the corresponding Z* is the optimal value.
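The input gradient derived in Section 3.3 and used in equation (23) is easy to verify numerically: the analytic formula of equation (19) should agree with central finite differences on any sigmoid network. The network shapes and weights below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, V, U):
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)
    y_aug = np.concatenate(([-1.0], y))
    return y, sigmoid(y_aug @ U)

def input_gradient(x, V, U):
    """dz_k/dx_i from equation (19):
    z_k(1-z_k) * sum_j u_jk * y_j(1-y_j) * v_ij, returned as a (q, o) array."""
    y, z = forward(x, V, U)
    # scale column j of v_ij by y_j(1-y_j), then sum over hidden units j
    G = (V[1:, :] * (y * (1.0 - y))) @ U[1:, :]
    return G * (z * (1.0 - z))

# Finite-difference check on a random 3-5-2 network
rng = np.random.default_rng(2)
V = rng.normal(size=(4, 5))   # q = 3, p = 5
U = rng.normal(size=(6, 2))   # o = 2
x = np.array([0.3, -0.1, 0.8])
G = input_gradient(x, V, U)
h = 1e-6
num = np.zeros_like(G)
for i in range(3):
    xp, xm = x.copy(), x.copy()
    xp[i] += h
    xm[i] -= h
    num[i] = (forward(xp, V, U)[1] - forward(xm, V, U)[1]) / (2 * h)
print(np.allclose(G, num, atol=1e-5))   # True
```

The agreement confirms that equation (19) supplies exactly the gradient needed by the ascent iteration of equation (23).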
If the condition (24) is not satisfied, then let

X(m+1) = X(m) + alpha(m) grad F(X(m))   (26)

Z(m+1) = F(X(m+1))   (27)

The adjustment of X(m+1) is described in the following two cases.

Case 1: X(m+1) is not superior to X(m). Let

alpha(m) = alpha(m) / 2   (28)

According to equations (26) and (27), a new X(m+1) is obtained by recalculation, and it is then judged whether X(m+1) is superior to X(m), namely whether it satisfies

F(X(m+1)) > F(X(m))   (29)

If X(m+1) satisfies equation (29), the next iteration begins. If X(m+1) does not satisfy equation (29), equation (28) is used again to reduce the step size, equations (26) and (27) are used to recalculate a new X(m+1), and the test of equation (29) is repeated. These steps continue until the newly calculated X(m+1) is superior to X(m), and this iteration then comes to an end.

Case 2: X(m+1) is superior to X(m). Let

alpha(m) = 2 alpha(m)   (30)

and recalculate a new X(m+1) with equations (26) and (27); then determine whether this X(m+1) is still superior to X(m), i.e., whether it satisfies equation (29). If not, the previously obtained (superior) X(m+1) is retained (equation (31)); at the same time, the step size is reduced by half, and this iteration ends. If the newly calculated X(m+1) satisfies equation (29), its value is assigned to X(m):

X(m) = X(m+1)   (32)

The step size then continues to be increased using equation (30), and X(m+1) is recalculated by equations (26) and (27), until an X(m+1) that is not superior to X(m) is obtained; X(m) is then replaced by the last superior X(m+1), the step size is reduced by half, and this iteration is over.

After the completion of each iteration, it is judged whether the result meets the iteration termination condition, i.e., equation (24). If it does, let

X* = X(m+1)   (33)

where X* is the optimal solution and the corresponding Z* is the required optimal value. If it does not satisfy equation (24), the next iteration begins; this continues until the X(m) values satisfy equation (24), and the optimization is then complete.
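The step-size schedule of Section 3.4 (double the step while the trial point improves, halve it when it does not, and carry the final step into the next iteration) can be sketched as a generic gradient-ascent loop. This is a simplified sketch, not the authors' implementation: `grad_f` stands for the network gradient of equation (23), and the quadratic test function and initial step are illustrative assumptions.

```python
import numpy as np

def maximize(f, grad_f, x0, alpha=1.0, eps=1e-8, max_iter=1000):
    """Gradient ascent with the inherited doubling/halving step size
    described in Section 3.4 (a simplified sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(alpha * g) <= eps:      # termination, eq (24)
            break
        x_new = x + alpha * g                     # trial point, eq (26)
        if f(x_new) > f(x):                       # superior: grow the step
            while True:
                alpha *= 2.0                      # eq (30)
                x_try = x + alpha * g
                if f(x_try) > f(x_new):
                    x_new = x_try                 # keep the better point, eq (32)
                else:
                    alpha /= 2.0                  # revert and halve the step
                    break
        else:                                     # inferior: shrink the step
            while f(x_new) <= f(x) and alpha > 1e-16:
                alpha /= 2.0                      # eq (28)
                x_new = x + alpha * g
        x = x_new
    return x, f(x)

# Illustrative test: F(x) = -(x - 10)^2 + 105 has its maximum at x = 10
f = lambda x: -(x[0] - 10.0) ** 2 + 105.0
grad = lambda x: np.array([-2.0 * (x[0] - 10.0)])
x_star, z_star = maximize(f, grad, [1.0], alpha=0.25)
print(round(x_star[0], 3), round(z_star, 3))   # 10.0 105.0
```

Because the step factor is not reset between iterations, a step that worked at the previous point is reused immediately, which is the "inheritance" property noted in the conclusion.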
4. Demonstration calculation. Because the optimal value of a black-box problem is unknown, it is difficult to validate the accuracy and stability of an optimization method. To overcome this problem, two known functions are selected and discretized, and the discretized samples are used to train the BP neural network. The optimal network output is then obtained from the fitted network, so that the optimized results can be compared with the theoretical optimal values.

4.1. Example 1. Let equation (34) be the known function; its theoretical maximum value is max F(X) = F(57.625, 51.136, 1) = 2045.412. In this example, the BP neural network is used to fit the function, and the maximum of the fitted network output is then sought. The first step is the discretization of F(X): six equally spaced points are selected for each of x1, x2, and x3 in the ranges 30 to 80, 25 to 75, and -10 to 15 respectively, giving 216 points in total. The corresponding values of F(X) are calculated from equation (34), and the BP neural network is then used to fit the function. At this step, the network structure is 3-25-1. When the network reaches the pre-specified precision e = 10^-4, the network weights and thresholds are kept; under this condition, the average relative error of the network fit is 0.002854%. Second, max F(X) is calculated using the optimization method given in this paper. Table 1 shows that the same max F value is obtained from ten different initial points at epsilon = 0. In Table 1, the average of the optimized results over the 10 runs is also given, and beta is the stability indicator for the optimized results, defined by equation (35).
Table 1. Calculation results of F(X)

Num.     x1(0)   x2(0)   x3(0)    x1       x2       x3      max F      beta_i
1        30      25      -10      57.697   51.167   1.000   2045.329   1
2        40      35      -5       57.697   51.167   1.000   2045.329   1
3        50      45      0        57.697   51.167   1.000   2045.329   1
4        60      55      5        57.697   51.167   1.000   2045.329   1
5        70      65      10       57.697   51.167   1.000   2045.329   1
6        80      75      15       57.697   51.167   1.000   2045.329   1
7        30      45      5        57.697   51.167   1.000   2045.329   1
8        40      65      10       57.697   51.167   1.000   2045.329   1
9        50      75      0        57.697   51.167   1.000   2045.329   1
10       80      35      10       57.697   51.167   1.000   2045.329   1

The average optimized value over the ten runs is max F = 2045.329, and the stability indicator is beta_i = 1 for every run.

4.2. Example 2. Let equation (36) be the known function

F(x) = x^2 - 20x + 205   (36)

The minimum value of this function is min F(x) = F(10) = 105. The BP neural network is used first to fit the function and then to seek the minimum network output. First, the minimization of F(x) is converted into a maximization using

min F(x) = -max[-F(x)]   (37)

Then F(x) is discretized: 81 points are taken in the range 0 to 20 at an interval of 0.25. The corresponding function values are calculated, and the BP neural network is used to fit the function. At this step, the network structure is 1-10-1. When the network reaches the pre-specified precision e = 10^-5, the network weights and thresholds are kept; at this point, the average relative error of the network fit is 0.1004%. Then min F(x) is calculated using the optimization method given in this paper. Table 2 shows the results calculated from ten different initial points at epsilon = 0; the average of the optimization results over the 10 runs is also given.
Table 2. Calculation results of F(x)

Num.     x(0)     x       min F     beta_i
1        0        9.971   105.003   1
2        2.25     9.971   105.003   1
3        4.75     9.971   105.003   1
4        7.25     9.971   105.003   1
5        9.75     9.971   105.003   1
6        12.25    9.971   105.003   1
7        14.75    9.971   105.003   1
8        17.25    9.971   105.003   1
9        19.75    9.971   105.003   1
10       20       9.971   105.003   1

The average optimized value over the ten runs is min F = 105.003. Table 1 and Table 2 show that the results of the optimization method are very stable and that the optimal values of F(X) are very close to the theoretical optimal values.
In example 1, the average relative errors of x1, x2, and x3 are 0.125%, 0.061%, and 0% respectively. The maximum function value found by BP neural network optimization is also very close to the theoretical maximum; the relative error is only 0.00406%. In example 2, the average relative error of x is 0.29%, and the optimal function value found by the network is again very close to the theoretical value, with a relative error of only 0.00286%. Moreover, these errors also include the fitting errors. The results indicate that the accuracy of the network optimization is relatively high.
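The validation procedure above can be imitated in miniature: take a fixed network as the black box, run gradient ascent on its input using the gradient of equation (19), and check that the network output never decreases. The random 2-5-1 network and all parameter values here are illustrative assumptions, not the paper's trained networks.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, V, U):
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)
    y_aug = np.concatenate(([-1.0], y))
    return y, sigmoid(y_aug @ U)

def input_gradient(x, V, U):
    # equation (19): dz_k/dx_i for all i, k, flattened for o = 1
    y, z = forward(x, V, U)
    G = (V[1:, :] * (y * (1.0 - y))) @ U[1:, :]
    return (G * (z * (1.0 - z))).ravel()

# A fixed random 2-5-1 network stands in for a trained black-box model.
rng = np.random.default_rng(3)
V = rng.normal(size=(3, 5))
U = rng.normal(size=(6, 1))

x = np.zeros(2)                  # initial point X(0)
alpha = 0.5                      # initial step factor
z_start = forward(x, V, U)[1][0]
for _ in range(500):
    g = input_gradient(x, V, U)
    if np.linalg.norm(alpha * g) <= 1e-10:   # termination condition (24)
        break
    x_new = x + alpha * g                    # step along the gradient
    while forward(x_new, V, U)[1][0] <= forward(x, V, U)[1][0] and alpha > 1e-16:
        alpha /= 2.0                         # halve the step on failure
        x_new = x + alpha * g
    x = x_new
z_end = forward(x, V, U)[1][0]
print(z_end >= z_start)   # ascent never decreases the network output
```

Substituting a network actually trained on discretized samples of a known function, as in the two examples, would reproduce the paper's validation scheme in full.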
5. Conclusion.
(1) An unconstrained optimization method based on BP neural network is proposed, which is powerful for solving black-box problems.
(2) Once a BP neural network has been fitted, the method adjusts the network's input values to locate the maximum or minimum output value. The application of BP neural networks is thereby expanded by combining the network's fitting and optimization; in addition, the research provides a new way to solve black-box optimization problems and sets up a platform for further study of constrained optimization based on BP neural networks.
(3) A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximizing the network output value. Based on this model, the fundamental idea of the method is given and the partial derivatives of the network output with respect to the input are derived.
(4) The step size of the optimization method presented in this paper is inherited across iterations, which accelerates the optimization.
(5) The model is validated by sample calculations, and the results show that the algorithm is an effective method.
Acknowledgment. The research is supported by the National Natural Science Foundation of China and the National High Technology Research and Development Program of China (Grant Nos. 31071331 and 2006AA10A310-1).

REFERENCES

[1] Yu Wei-ping, Peng Yi-gong. Intelligent control technology research [C]. Eighth Conference on Industrial Instrumentation and Automation, 2007, 415-418.
[2] Li Shu-rong, Yang Qing, Guo Shu-hui. Neural network based adaptive control for a class of nonaffine nonlinear systems [J]. Systems Science and Mathematics, 2007, 27(2): 161-169.
[3] Chen Ming-jie, Ni Jin-ren, Cha Ke-mai, et al. Application of genetic algorithm-based artificial neural networks in 2D tidal flow simulation [J]. Journal of Hydraulic Engineering, 2003, (10): 1-12.
[4] Zhou Ling, Sun Jun, Yuan Yu-bo. Effects of combined activation function on BP algorithm's convergence speed [J]. Journal of Hohai University, 1999, 27(5): 107-108.
[5] Tang Wan-mei. The study of the optimal structure of BP neural network [J]. Systems Engineering Theory & Practice, 2005, (10): 95-100.
[6] Funahashi K. On the approximate realization of continuous mappings by neural networks [J]. Neural Networks, 1989, 2(7): 183-192.
[7] Hecht-Nielsen R. Theory of the backpropagation neural network [M]. Proceedings of the IEEE International Joint Conference on Neural Networks, Washington D.C., 1989.
[8] Zhang Y., Wu L. Weights optimization of neural network via improved BCO approach [J]. Progress In Electromagnetics Research, PIER 83, 185-198, 2008.
[9] Wang Wen-jian. The optimization of BP neural networks [J]. Computer Engineering and Design, 2000, 21(6): 8-10.
[10] Zhang Shan, He Jiannong. Research on optimized algorithm for BP neural networks [J]. Computer and Modernization, 2009, (1): 73-80.
[11] Xing Hihua, Lin Hngyan, Chen Huandong, et al. Sensitivity analysis of BP neural network optimized by genetic algorithm and its applications to feature reduction [J]. International Review on Computers and Software, 2012, 7(6): 3084-3089.
[12] Chunsheng Dong, Liu Dong, Mingming Yang. The application of the BP neural network in the nonlinear optimization [J]. Advances in Intelligent and Soft Computing, 2010, 78: 727-732.
[13] Shifei Ding, Chunyang Su, Junzhao Yu. An optimizing BP neural network algorithm based on genetic algorithm [J]. Artificial Intelligence Review, 2011, 36(2): 153-162.
[14] Li Song, Liu Lijun, Zhai Man. Prediction for short-term traffic flow based on modified PSO optimized BP neural network [J]. Systems Engineering - Theory & Practice, 2012, 39(9): 2045-2049.
[15] Xing Hihua, Lin Hngyan. An intelligent method optimizing BP neural network model [C]. 2nd International Conference on Materials and Products Manufacturing Technology, ICMPMT 2012, 2012, 2470-2474.
[16] Merad L., Bendimerad F. T., Meriah S. M., et al. Neural networks for synthesis and optimization of antenna arrays [J]. Radioengineering Journal, 2007, 16(1): 23-30.
[17] Gulati T., Chakrabarti M., Singh A., et al. Comparative study of response surface methodology, artificial neural network and genetic algorithms for optimization of soybean hydration [J]. Food Technol Biotechnol, 2010, 1(48): 11-18.
[18] Wang Xin-min, Zhao Bin, Wang Xian-lai. Optimization of drilling and blasting parameters based on back-propagation neural network [J]. Journal of Central South University (Natural Science), 2009, 40(5): 1411-1416.
[19] Liu Lei. Index tracking optimization method based on genetic neural network [J]. Systems Engineering Theory & Practice, 2010, 30(1): 22-29.
[20] Han Li-qun. Artificial Neural Network Tutorial [M]. Beijing University of Posts and Telecommunications Press, 2006.