ICIC Express Letters, ICIC International ©2010, ISSN 1881-803X
Volume 4, Number 5, October 2010, pp. 1-9

An Unconstrained Optimization Method Based on BP Neural Network

Fulin Wang*, Huixia Zhu, Jiquan Wang
College of Engineering, Northeast Agricultural University, Harbin 150030, China
Email: fulinwang@yahoo.com.cn

ABSTRACT. An unconstrained optimization method based on back propagation (BP) neural network is proposed in this paper. The method is mainly applied to solving black box problems. Once a BP neural network has been fitted, the method adjusts the input values of the network so as to drive the output to its maximum or minimum. The application range of BP neural networks is thus expanded by combining the network's fitting and optimization capabilities. In addition, the research provides a new way to solve issues related to black box optimization, and a new platform is set up for further study of constrained optimization problems based on BP neural networks. A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximization of the network output as the objective. Based on this model, the fundamental idea of the unconstrained optimization method is given, the partial derivative equations of the BP neural network's output with respect to its input are derived, and an algorithm implementing the method is proposed. The model is validated by sample calculations, and the results show that the algorithm is an effective method.
Keywords: BP neural network; Unconstrained; Model; Optimization method

1. Introduction.
Back propagation (BP) neural networks are one of the important research fields in intelligent control and intelligent automation [1,2]. A BP neural network is composed of many simple parallel processing modules, similar to the neurons of a biological neural system. A neural network is a nonlinear dynamic system characterized by distributed information storage and parallel cooperative processing [3]. The structure of a single neuron is simple and its function is limited, but a network system containing a large number of neurons has rich functionality and can be used in many applications. The BP neural network model, one of the most important artificial neural network models, is a multi-layer feed-forward neural network and is the most widely studied and used at present. Theory has proved that a three-layer BP neural network with enough hidden-layer nodes can approximate any complex nonlinear mapping [4-7], which indicates that BP neural networks fit rather easily.

The optimization of BP neural networks has been studied in the literature. Much work has been done on learning rate, weights, thresholds, network structure optimization, and so on, in order to address problems of BP neural networks such as fluctuation, oscillation, slow fitting speed, and unbounded network structure [8-15]. In practice, however, people are concerned not only with the fitting quality of a neural network, but also with how to achieve the maximum or minimum output value by adjusting the input values. The existing literature on BP neural network optimization mainly identifies the relation between network input and output and then reads off the required output values [16-19]. These studies can actually be regarded as simulation rather than optimization, because the selected optimum program is based on simulation results. The starting point of this paper is therefore to probe an optimization method based on the BP neural network itself.
An unconstrained optimization method based on BP neural network is presented in this research to solve the black box problem: experimental or observed data are used to train the network without knowing the functional relation between input and output, and the output is then optimized by adjusting the input values. Once the BP neural network has been fitted, the method adjusts the input values of the network according to the determination of the maximum and minimum output values. The fitting and optimization of the BP neural network are thereby combined. This combination expands the application domain of BP neural networks, solves issues related to black box optimization, and sets up a new platform for further study of constrained optimization problems based on BP neural networks; such constrained problems will be discussed in further studies.

This paper is divided into five parts. The first part discusses the aim and significance of the research and the status quo, based on the BP neural network's adaptive characteristics. The second part describes the BP neural network's structure and its algorithm. The unconstrained optimization method based on BP neural network is given in the third part. An illustration of the method follows in the fourth part. Finally, the achievements of the research are summarized.
2. BP Neural Network Structure and Its Algorithm.

2.1. BP neural network structure. A BP neural network is a multi-layer feed-forward network, and the three-layer network structure is usually used [20]. The network structure is shown in Figure 1.

Figure 1. Three-layer BP network structure.

2.2. BP neural network algorithm.
(1) Forward propagation process. The input signal, starting from the input layer, passes through the hidden-layer units and is transmitted to the output layer, where the output signal is generated. If the output signal meets the given output requirement, the calculation terminates; otherwise, the computation turns to error signal back-propagation. The forward propagation process is calculated as follows.

Suppose the input layer has q+1 input signals and any input signal is denoted by i; the hidden layer has p+1 neurons and any of them is denoted by j; the output layer has o output neurons and any of them is denoted by k. The weights between the input layer and the hidden layer are v_{ij} (i = 0, 1, 2, ..., q; j = 1, 2, ..., p), where v_{0j} is the hidden-layer threshold. The weights between the hidden layer and the output layer are u_{jk} (j = 0, 1, 2, ..., p; k = 1, 2, ..., o), where u_{0k} is the output-layer threshold. Let the hidden-layer input be net_j (j = 1, 2, ..., p), the hidden-layer output be y_j (j = 1, 2, ..., p), the output-layer input be net_k (k = 1, 2, ..., o), and the output-layer output be z_k (k = 1, 2, ..., o). Suppose the training sample set is X = [X_1, X_2, ..., X_r, ..., X_n], where any training sample is X_r = [x_{r0}, x_{r1}, x_{r2}, ..., x_{rq}] (r = 1, 2, ..., n; x_{r0} = -1). The actual output and the desired output are z_r = [z_{r1}, z_{r2}, ..., z_{ro}]^T and d_r = [d_{r1}, d_{r2}, ..., d_{ro}]^T respectively. Let m be the iteration number; the weights and the actual output are then functions of m.
For the signal forward propagation process, taking X_r as the network input training sample, we have:

net_j = \sum_{i=0}^{q} v_{ij} x_{ri}   (j = 1, 2, ..., p)   (1)

y_j = f(net_j)   (j = 1, 2, ..., p)   (2)

net_k = \sum_{j=0}^{p} u_{jk} y_j   (k = 1, 2, ..., o)   (3)

z_{rk} = f(net_k)   (k = 1, 2, ..., o)   (4)

where y_0 = -1 carries the output-layer threshold, analogous to x_{r0}. For the above equations, the error signal of the k-th neuron of the output layer is e_k = d_{rk} - z_{rk}, and the k-th neuron's error energy is defined as (1/2) e_k^2. The sum of the error energy over all neurons of the output layer is:

E = \frac{1}{2} \sum_{k=1}^{o} (d_{rk} - z_{rk})^2   (5)

If E <= e (where e expresses the expected calculation accuracy), the calculation is finished; otherwise, back-propagation computing is carried out.
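To make the forward pass concrete, the following is a minimal NumPy sketch of equations (1)-(4), assuming the unipolar sigmoid of equation (12) as the transfer function. The layout is ours, not the paper's: the thresholds occupy row 0 of each weight matrix and are driven by the fixed -1 inputs (x_{r0} = y_0 = -1), and all names are illustrative.

```python
import numpy as np

def sigmoid(x):
    # Unipolar sigmoid transfer function, cf. equation (12)
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, V, U):
    """Forward pass of the three-layer BP network, equations (1)-(4).

    x : input vector of length q
    V : (q+1) x p input-to-hidden weights; row 0 holds the thresholds v_0j
    U : (p+1) x o hidden-to-output weights; row 0 holds the thresholds u_0k
    """
    x_aug = np.concatenate(([-1.0], x))   # x_r0 = -1 carries the threshold
    y = sigmoid(x_aug @ V)                # equations (1)-(2)
    y_aug = np.concatenate(([-1.0], y))   # y_0 = -1 carries the threshold
    z = sigmoid(y_aug @ U)                # equations (3)-(4)
    return y, z
```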
(2) Error back-propagation process. The error signal is the difference between the actual network output and the desired output. The error signal propagates backward layer by layer from the output end, which is the back-propagation of the error signal. During error back-propagation, the network weights are adjusted by error feedback; through repeated modification of the weights, the actual network output gradually approaches the desired output. The error back-propagation process is calculated as follows:

\delta_k^o = e_k z_k (1 - z_k)   (6)

u_{jk}(m+1) = u_{jk}(m) + \eta \delta_k^o y_j   (7)

\delta_j^y = y_j (1 - y_j) \sum_{k=1}^{o} \delta_k^o u_{jk}   (8)

v_{ij}(m+1) = v_{ij}(m) + \eta \delta_j^y x_{ri}   (9)

where \eta is the learning rate, a given constant. After the new weights of the various layers are calculated, the calculation returns to the forward propagation process.
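A matching sketch of one training iteration for a single sample, combining the forward pass with the weight updates of equations (6)-(9); again the names and the threshold-in-row-0 layout are our assumptions, not the paper's:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, d, V, U, eta=0.5):
    """One forward/backward iteration for a single sample, equations (1)-(9).

    x : input vector (length q), d : desired output vector (length o)
    V : (q+1) x p input-to-hidden weights, row 0 = thresholds
    U : (p+1) x o hidden-to-output weights, row 0 = thresholds
    eta : learning rate (a given constant)
    Returns the updated weights and the error energy E of equation (5).
    """
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)                       # equations (1)-(2)
    y_aug = np.concatenate(([-1.0], y))
    z = sigmoid(y_aug @ U)                       # equations (3)-(4)

    e = d - z                                    # output error signal e_k
    E = 0.5 * np.sum(e ** 2)                     # error energy, equation (5)

    delta_o = e * z * (1.0 - z)                  # equation (6)
    delta_y = y * (1.0 - y) * (U[1:] @ delta_o)  # equation (8)

    U = U + eta * np.outer(y_aug, delta_o)       # equation (7)
    V = V + eta * np.outer(x_aug, delta_y)       # equation (9)
    return V, U, E
```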
3. The Unconstrained Optimization Method Based on BP Neural Network.

3.1. Mathematical model. Since a minimization problem can be converted into a maximization problem, the maximum network output is used as the example for convenience to illustrate the problem in this paper. Letting F(X) be the relation between input and output, the mathematical model of unconstrained optimization based on BP neural network can be expressed as:

\max Z = F(X)   (10)

where X is the input vector, X = (x_1, x_2, ..., x_q)^T, and Z is the output of the BP neural network.
3.2. Basic idea. The gradient of the output at an initial point X^(0), selected artificially or at random, is calculated first. If the gradient at X^(0) is not 0, a new point X^(1) that is better than X^(0) can be found in the direction of the gradient at X^(0), and the gradient at X^(1) can then be computed. If the gradient at X^(1) is not 0, a new point X^(2) that is better than X^(1) is calculated in the direction of the gradient at X^(1). This process continues until the gradient becomes 0 or a better point cannot be found (at this point, the product of the gradient and the step size is less than or equal to the precision threshold). The X value at this point is the optimal input, and the corresponding network output is the optimal output.
3.3. The partial derivatives of the network output with respect to the input. The gradient vector of function F(X) is

\nabla F(X) = (\partial F/\partial x_1, \partial F/\partial x_2, ..., \partial F/\partial x_q)^T   (11)

Thus, as long as the partial derivatives of F(X) are calculated, the gradient of F(X) can be obtained. The following derives the partial derivatives of the BP neural network output with respect to its input. The BP neural network transfer function is generally the unipolar sigmoid function:

f(x) = \frac{1}{1 + e^{-x}}   (12)

This transfer function is taken as the example for deriving the partial derivatives of the BP neural network's output with respect to its input. The derivative of f(x) is

f'(x) = f(x)(1 - f(x))   (13)

For k = 1, 2, ..., o; i = 1, 2, ..., q; j = 1, 2, ..., p, the chain rule applied to equations (1)-(4) gives

\partial z_k/\partial net_k = z_k (1 - z_k)   (14)

\partial net_k/\partial y_j = u_{jk}   (15)

\partial y_j/\partial net_j = y_j (1 - y_j)   (16)

\partial net_j/\partial x_i = v_{ij}   (17)

\partial y_j/\partial x_i = y_j (1 - y_j) v_{ij}   (18)

So

\partial z_k/\partial x_i = z_k (1 - z_k) \sum_{j=1}^{p} u_{jk} y_j (1 - y_j) v_{ij}   (19)

If we let

\delta_k = z_k (1 - z_k)   (20)

\delta_j = y_j (1 - y_j)   (21)

then we have

\partial z_k/\partial x_i = \delta_k \sum_{j=1}^{p} u_{jk} \delta_j v_{ij}   (22)
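Equation (22) translates directly into code. The sketch below, under the same layout assumptions as the earlier sketches (thresholds in row 0 of each weight matrix), returns the gradient of one output neuron z_k with respect to the input vector X:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def output_input_gradient(x, V, U, k=0):
    """Partial derivatives of network output z_k w.r.t. the inputs, equation (19):

        dz_k/dx_i = z_k (1 - z_k) * sum_j u_jk * y_j (1 - y_j) * v_ij

    V : (q+1) x p weights (row 0 = thresholds), U : (p+1) x o weights.
    Returns the gradient vector of length q.
    """
    x_aug = np.concatenate(([-1.0], x))
    y = sigmoid(x_aug @ V)
    y_aug = np.concatenate(([-1.0], y))
    z = sigmoid(y_aug @ U)

    delta_k = z[k] * (1.0 - z[k])        # equation (20)
    delta_j = y * (1.0 - y)              # equation (21)
    # Threshold rows do not depend on x, so they drop out of the sum.
    return delta_k * (V[1:] @ (U[1:, k] * delta_j))   # equation (22)
```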
3.4. The unconstrained optimization method.
Let X^(0) be the initial point, selected artificially or at random, and let X^(m) be the point obtained at the m-th iteration. The gradient at X^(m) is calculated by equation (23):

\nabla F(X^{(m)}) = (\partial Z/\partial x_1, \partial Z/\partial x_2, ..., \partial Z/\partial x_q)^T |_{X = X^{(m)}}   (23)

If X^(m) meets the iteration termination condition given by

||h \nabla F(X^{(m)})|| <= \varepsilon   (24)

where h is the step factor (h > 0) and \varepsilon is the pre-specified precision value, then the optimal solution is

X^* = X^{(m)}   (25)

and the corresponding Z^* is the optimal value. If the termination condition (24) is not satisfied, then let

X^{(m+1)} = X^{(m)} + h \nabla F(X^{(m)})   (26)

Z^{(m+1)} = F(X^{(m+1)})   (27)

The adjustment method for X^(m+1) is described in the following two cases.

Case 1: X^(m+1) is not superior to X^(m). Let

h = h/2   (28)

According to equations (26) and (27), a new X^(m+1) value is obtained by recalculating; it is then judged whether X^(m+1) is superior to X^(m), namely whether it satisfies equation (29):

F(X^{(m+1)}) > F(X^{(m)})   (29)

If X^(m+1) satisfies equation (29), the next iteration begins. If X^(m+1) does not satisfy equation (29), equation (28) is used to reduce the step size again, equations (26) and (27) are used to recalculate a new X^(m+1), and equation (29) is checked once more. These steps are repeated until the newly calculated X^(m+1) is superior to X^(m); this iteration then comes to an end.

Case 2: X^(m+1) is superior to X^(m). Let

h = 2h   (30)

and a new X^(m+1) value is obtained by recalculating with equations (26) and (27); it is then determined whether the new point is superior, i.e. whether it satisfies equation (29). If not, the previously obtained X^(m+1) is kept (31), the step size is halved back at the same time, and this iteration is done. If the new X^(m+1) satisfies equation (29), its value is assigned to X^(m) by equation (32):

X^{(m)} = X^{(m+1)}   (32)

At this time the step size continues to be increased using equation (30), and X^(m+1) is recalculated by equations (26) and (27) until an X^(m+1) that is not superior to X^(m) is obtained; X^(m+1) is then replaced by X^(m), the step size is halved at the same time, and this iteration is over.

After completion of each iteration, it has to be judged whether the result meets the iteration termination condition (24). If it does, then let

X^* = X^{(m+1)}   (33)

where X^* is the optimal solution and the corresponding Z^* is the required optimal value. If it does not satisfy equation (24), the next iteration begins, continuing until the X^(m) values satisfy equation (24); the optimization is then over.
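The following sketch is one possible reading of this procedure: gradient ascent with the step factor doubled while trial points keep improving (Case 2) and halved when they do not (Case 1), terminating on condition (24). It is generic over the objective, so it can be driven by the network output and the gradient sketch of Section 3.3; the exact bookkeeping of equations (31)-(32) is paraphrased.

```python
import numpy as np

def maximize(f, grad, x0, h=1.0, eps=1e-8, max_iter=1000):
    """Gradient ascent with the halving/doubling step rule of Section 3.4.

    f    : callable returning the scalar network output Z for an input vector
    grad : callable returning dZ/dX, equation (23)
    h    : step factor; eps : pre-specified precision of condition (24)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(h * g) <= eps:          # termination condition (24)
            break
        x_new = x + h * g                         # trial point, equation (26)
        if f(x_new) > f(x):                       # superiority test (29)
            # Case 2: keep doubling the step while it still improves.
            while True:
                h *= 2.0                          # equation (30)
                x_try = x + h * g
                if f(x_try) > f(x_new):
                    x_new = x_try                 # accept, cf. equation (32)
                else:
                    h *= 0.5                      # halve back; iteration ends
                    break
        else:
            # Case 1: halve the step until a superior point appears.
            while f(x_new) <= f(x):
                h *= 0.5                          # equation (28)
                if np.linalg.norm(h * g) <= eps:  # no better point nearby
                    return x, f(x)
                x_new = x + h * g
        x = x_new
    return x, f(x)
```

With the earlier sketches in scope, something like maximize(lambda x: forward(x, V, U)[1][0], lambda x: output_input_gradient(x, V, U, k=0), x0) adjusts the network input toward the maximum of output neuron 0. Note also that h is not reset between iterations, so the step size found in one iteration is inherited by the next; the conclusion credits this inheritance with accelerating the optimization.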
4. Demonstration Calculation. Because the optimal value of a black box problem is unknown, it is difficult to validate the accuracy and stability of the optimization method. To overcome this problem, we select two known functions for discretization and then use the discretized samples to train the BP neural network. The optimal value of the network output after fitting can then be compared with the theoretical optimal value.
4.1. Example 1. Let the known function be given by equation (34):

(34)

The theoretical maximum value of this function is max F(X) = F(57.625, 51.136, 1) = 2045.412. In this example, a BP neural network is used to fit the function, and the maximum network output after fitting is then sought. The first step is discretization of the function F(X). Six points with equal intervals are selected for x_1, x_2, and x_3 in the ranges 30 to 80, 25 to 75, and -10 to 15 respectively, giving a total of 216 points. The corresponding values of F(X) are calculated from equation (34), and the BP neural network is then used to fit the function. At this step, the network structure is 3-25-1. When the network reaches the pre-specified precision e = 10^{-4}, the network weights and thresholds are kept. Under this condition, the average relative error of the network fitting is 0.002854%. Second, max F(X) is calculated by using the optimization method given in this article.
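The discretization step can be sketched as follows. The body of equation (34) did not survive extraction, so the function F below is a hypothetical stand-in with its peak placed near the reported optimum; only the 6 x 6 x 6 grid construction reflects the paper.

```python
import numpy as np
from itertools import product

# Hypothetical stand-in: equation (34) is not recoverable from the source
# text. A concave quadratic peaking near (57.625, 51.136, 1) merely
# illustrates the pipeline.
def F(x1, x2, x3):
    return 2045.412 - (x1 - 57.625)**2 - (x2 - 51.136)**2 - (x3 - 1.0)**2

# Six equally spaced points per variable over the stated ranges: 6^3 = 216 samples.
x1_grid = np.linspace(30, 80, 6)
x2_grid = np.linspace(25, 75, 6)
x3_grid = np.linspace(-10, 15, 6)

samples = np.array(list(product(x1_grid, x2_grid, x3_grid)))  # shape (216, 3)
targets = np.array([F(*s) for s in samples])

# Inputs and targets are normalized to [0, 1] before training the 3-25-1
# network, so that the sigmoid output layer can represent the targets.
X = (samples - samples.min(axis=0)) / np.ptp(samples, axis=0)
T = (targets - targets.min()) / np.ptp(targets)
```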
Table 1 shows the same max F values calculated from ten different initial points at \varepsilon = 0. In Table 1, the Average column gives the average value of the optimized results over the 10 runs, and \beta_i is the stability indicator used to measure the optimized results, defined by equation (35) as the ratio of each run's optimized value F_i^* to the average \bar{F}^*:

\beta_i = F_i^* / \bar{F}^*   (35)

Table 1. Calculation results of F(X)
Num.    1         2         3         4         5         6         7         8         9         10        Average
x1(0)   30        40        50        60        70        80        30        40        50        80
x2(0)   25        35        45        55        65        75        45        65        75        35
x3(0)   -10       -5        0         5         10        15        5         10        0         10
x1      57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697    57.697
x2      51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167    51.167
x3      1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000     1.000
maxF    2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329  2045.329
\beta_i 1         1         1         1         1         1         1         1         1         1
4.2. Example 2. Let the known function be given by equation (36):

F(x) = x^2 - 20x + 205   (36)

The minimum value of the function is min F(x) = F(10) = 105. The following is the process of fitting the function with a BP neural network and seeking the minimum network output after fitting. First, the minimization of F(x) is turned into a maximization using equation (37):

\min F(x) = -\max[-F(x)]   (37)

Then the function F(x) is discretized: 81 points are taken in the range 0 to 20 at an interval of 0.25. The corresponding values of -F(x) are calculated according to equation (37), and the BP neural network is used to fit the function. At this step, the network structure is 1-10-1. When the network meets the pre-specified precision e = 10^{-5}, the network weights and thresholds are kept; at this point, the average relative error of the network fitting is 0.1004%. Then min F(x) is calculated by using the optimization method given in this paper. Table 2 shows the results calculated from ten different initial points at \varepsilon = 0. In Table 2, the Average column gives the average value of the optimization results over the 10 runs.
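Here equation (36) is fully known, so the discretization and the min-to-max conversion of equation (37) can be sketched exactly; the training of the 1-10-1 network then proceeds as in Section 2:

```python
import numpy as np

def F(x):
    # Equation (36): F(x) = x^2 - 20x + 205, with minimum F(10) = 105.
    return x**2 - 20.0*x + 205.0

# 81 points in the range 0 to 20 at an interval of 0.25.
xs = np.arange(0.0, 20.0 + 0.25, 0.25)
assert xs.size == 81

# Equation (37): minimizing F is the same as maximizing -F and negating.
targets = -F(xs)

# The 1-10-1 network is trained on (xs, targets); maximizing the fitted
# network output with the method of Section 3.4 and negating the result
# recovers min F(x).
```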
Table 2. Calculation results of F(x)

Num.    1        2        3        4        5        6        7        8        9        10       Average
x(0)    0        2.25     4.75     7.25     9.75     12.25    14.75    17.25    19.75    20
x       9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971    9.971
minF    105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003  105.003
\beta_i 1        1        1        1        1        1        1        1        1        1

Table 1 and Table 2 show that the results of the optimization method are very stable and the optimal value of F(X) is very close to the theoretical optimal value.
In Example 1, the average relative errors of x_1, x_2, and x_3 are 0.125%, 0.061%, and 0% respectively. The maximum function value found by BP neural network optimization is also very close to the theoretical maximum value, with a relative error of only 0.00406%. In Example 2, the average relative error of x is 0.29%. The optimal value found by BP neural network optimization is likewise very close to the theoretical optimal value, with a relative error of only 0.00286%. Moreover, these errors also include the fitting errors. The results indicate that the accuracy of the network optimization is relatively high.
5. Conclusion.

(1) An unconstrained optimization method based on BP neural network is proposed, which is powerful for solving the black box problem.

(2) Once the BP neural network has been fitted, the method adjusts the input values of the network according to the determination of the maximum and minimum output values. The application of BP neural networks is thus expanded by combining the network's fitting and optimization. In addition, the research provides a new method for solving issues related to black box optimization, and a new platform is set up for further study of constrained optimization problems based on BP neural networks.

(3) A general mathematical model for BP neural network unconstrained optimization is established, using the unipolar sigmoid function as the transfer function and maximization of the network output as the objective. Based on this model, the fundamental idea of the unconstrained optimization method is given and the partial derivative equations of the BP neural network's output with respect to its input are derived.

(4) The step size of the optimization method presented in this paper is inherited between iterations, which accelerates the optimization.

(5) The model is validated by demonstration calculations, and the results show that the algorithm is an effective method.
Acknowledgment. The research is supported by the National Natural Science Foundation of China and the National High Technology Research and Development Program of China (Grant Nos. 31071331, 2006AA10A310-1).

REFERENCES

[1] YU Wei-ping, PENG Yi-gong. Intelligent control technology research[C]. Proceedings of the Eighth Conference on Industrial Instrumentation and Automation, 2007, 415-418.
[2] LI Shu-rong, YANG Qing, GUO Shu-hui. Neural network based adaptive control for a class of nonaffine nonlinear systems[J]. Systems Science and Mathematics, 2007, 27(2): 161-169.
[3] CHEN Ming-jie, NI Jin-ren, CHA Ke-mai, et al. Application of genetic algorithm-based artificial neural networks in 2D tidal flow simulation[J]. Journal of Hydraulic Engineering, 2003, (10): 1-12.
[4] ZHOU Ling, SUN Jun, YUAN Yu-bo. Effects of combined activation function on BP algorithm's convergence speed[J]. Journal of Hohai University, 1999, 27(5): 107-108.
[5] TANG Wan-mei. The study of the optimal structure of BP neural network[J]. Systems Engineering Theory & Practice, 2005, (10): 95-100.
[6] Funahashi K. On the approximate realization of continuous mappings by neural networks[J]. Neural Networks, 1989, 2(7): 183-192.
[7] Hecht-Nielson R. Theory of the backpropagation neural networks[M]. Washington D.C.: Proceedings of IEEE International Joint Conference on Neural Networks, 1989.
[8] Zhang Y., Wu L. Weights optimization of neural network via improved BCO approach[J]. Progress In Electromagnetics Research, PIER 83, 185-198, 2008.
[9] WANG Wen-jian. The optimization of BP neural networks[J]. Computer Engineering and Design, 2000, 21(6): 8-10.
[10] ZHANG Shan, HE Jiannong. Research on optimized algorithm for BP neural networks[J]. Computer and Modernization, 2009, (1): 73-80.
[11] Xing Hihua, Lin Hngyan, Chen Huandong, et al. Sensitivity analysis of BP neural network optimized by genetic algorithm and its applications to feature reduction[J]. International Review on Computers and Software, 2012, 7(6): 3084-3089.
[12] Chunsheng Dong, Liu Dong, Mingming Yang. The application of the BP neural network in the nonlinear optimization[J]. Advances in Intelligent and Soft Computing, 2010, 78: 727-732.
[13] Shifei Ding, Chunyang Su, Junzhao Yu. An optimizing BP neural network algorithm based on genetic algorithm[J]. Artificial Intelligence Review, 2011, 36(2): 153-162.
[14] Li Song, Liu Lijun, Zhai Man. Prediction for short-term traffic flow based on modified PSO optimized BP neural network[J]. Systems Engineering - Theory & Practice, 2012, 39(9): 2045-2049.
[15] Xing Hihua, Lin Hngyan. An intelligent method optimizing BP neural network model[C]. 2nd International Conference on Materials and Products Manufacturing Technology, ICMPMT 2012, 2012, 2470-2474.
[16] Merad L., Bendimerad F. T., Meriah S. M., et al. Neural networks for synthesis and optimization of antenna arrays[J]. Radioengineering Journal, 2007, 16(1): 23-30.
[17] Gulati T., Chakrabarti M., Singh A., et al. Comparative study of response surface methodology, artificial neural network and genetic algorithms for optimization of soybean hydration[J]. Food Technol Biotechnol, 2010, 1(48): 11-18.
[18] WANG Xin-min, ZHAO Bin, WANG Xian-lai. Optimization of drilling and blasting parameters based on back-propagation neural network[J]. Journal of Central South University (Natural Science), 2009, 40(5): 1411-1416.
[19] LIU Lei. Index tracking optimization method based on genetic neural network[J]. Systems Engineering Theory & Practice, 2010, 30(1): 22-29.
[20] HAN Li-qun. Artificial Neural Network Tutorial[M]. Beijing University of Posts and Telecommunications Press, 2006, 12.
