arXiv:1108.5397v1 [stat.ML] 26 Aug 2011

Prediction of peptide bonding affinity: kernel methods for nonlinear modeling

Charles Bergeron
Department of Mathematical Sciences

Theresa Hepburn, C. Matthew Sundling, Michael Krein, Bill Katt, Nagamani Sukumar, Curt M. Breneman
Center for Biotechnology and Interdisciplinary Studies

Kristin P. Bennett
Departments of Mathematical Sciences and Computer Science

Rensselaer Polytechnic Institute, Troy, NY, 12180

Abstract

This paper presents regression models obtained from a process of blind prediction of peptide binding affinity from provided descriptors for several distinct datasets as part of the 2006 Comparative Evaluation of Prediction Algorithms (COEPRA) contest. This paper finds that kernel partial least squares, a nonlinear partial least squares (PLS) algorithm, outperforms PLS, and that the incorporation of transferable atom equivalent features improves predictive capability.

Keywords: chemometrics, peptide bonding affinity, machine learning, kernel partial least squares, transferable atom equivalent descriptors

List of acronyms: Comparative Evaluation of Prediction Algorithms (COEPRA), cross-validation (CV), kernel partial least squares (KPLS), leave-one-out (LOO), partial least squares (PLS), reproducing kernel Hilbert space (RKHS), Transferable Atom Equivalent (TAE), two-dimensional (2D)

E-mail address: chbergeron@gmail.com
1 Introduction

Comparative Evaluation of Prediction Algorithms (COEPRA, http://www.coepra.org/) is a modeling competition organized to provide objective testing of various algorithms via the process of blind prediction for chemical, biological, and medical data. COEPRA's stated goals are to advance modeling algorithms and software as well as to provide reference datasets to the research community.

Transferable Atom Equivalent (TAE) RECON features are electron-density derived descriptors obtained by fragment reconstruction. MOE features are geometrical, structural, physiochemical and topological 2D descriptors. RAD features are topological autocorrelations of TAE RECON descriptors. This paper shows how their addition to the COEPRA descriptors improves modeling performance.
Partial least squares (PLS) regression is a machine learning technique. Because it considers the covariance of the inputs with the output to guide the selection of features, it is much more stable than multiple linear regression. This approach was developed for, and is popular within, the field of chemometrics, where the number of variables is much greater than the number of samples and where correlated representations occur frequently [1]. Less well known to the chemometrics community is kernel partial least squares (KPLS) regression, a generalization of PLS that finds a nonlinear relation between features instead of being limited to a linear combination thereof [2]. This paper demonstrates how KPLS largely outperforms PLS on the COEPRA datasets.
The remainder of this paper accomplishes the following:
- The COEPRA datasets are described.
- TAE RECON features are defined, and SIMIL scores are introduced.
- PLS is generalized to a nonlinear, KPLS framework.
- Implementation issues for KPLS are discussed.
- Models submitted to the contest and their performances are stated.
- Post-contest analysis of the datasets, resulting in new, better performing models, is presented.
- Conclusions for the paper are provided.
2 Notation

Let x denote a column vector. Let x^T denote the transpose of x. Let X denote a matrix, X^T its transpose and X^{-1} its inverse. Write the identity matrix of appropriate size as I. The expected value of a set of numbers assembled into vector x is written E(x) and represents the mean value:

    E(x) = \frac{1}{n} \sum_{i=1}^{n} x_i.

The Euclidean norm \|x\|_2 of x is given by

    \|x\|_2 = \sqrt{ \sum_{i=1}^{n} x_i^2 }.
Table 1: Basic information about the datasets.

dataset   calibration samples   prediction samples   amino acids   COEPRA descriptors
   1               89                   88                 9              5787
   2               76                   76                 8              5144
   3              133                  133                 9              5787

3 COEPRA datasets

Three regression tasks were proposed during the summer of 2006.
For each task, a dataset consists of a calibration set and a prediction set. The following data are provided:
- For the calibration set, the COEPRA descriptors and the corresponding responses for each sample.
- For the prediction set, the COEPRA descriptors for each sample.

Hence, the calibration set is used to develop a model, and this model is propagated to the prediction set descriptors to make a prediction for the response. These predictions are compared with the actual values by the contest organizers after the close of the submission date. Each sample consists of a peptide sequence of amino acid residues (rounds 1 and 3 involve nonapeptides while round 2 involves octapeptides) and 643 COEPRA descriptors per amino acid. The nature of these descriptors remains unknown at this time. Table 1 presents basic characteristics of each round.

During the contest, the nature of the regression value was not known. That is no longer the case. For round 1, the output is the binding affinity to the HLA-A*0201 major histocompatibility complex [3]. For round 2, the output is the binding affinity of the mouse class I major histocompatibility complex [4]. For round 3, the output is the binding affinity to the HLA-A*0201 major histocompatibility complex [5].

The contest methodology proposed, for each round, to learn a model from calibration data, for which the response is known, and then propagate that model to prediction data to establish a prediction of the response that could then be evaluated by contest organizers against the true values. Contestants were free to add additional descriptors; thus we augmented the COEPRA descriptors with MOE and RAD descriptors that we now describe.
4 Atomic charge density fragment features

RECON is an algorithm for the rapid reconstruction of molecular electron densities and electron density-based properties of molecules, using pre-computed atomic charge density fragments and associated descriptors stored in a Transferable Atom Equivalent library. Molecular TAE descriptors are constructed in most cases by summation of the respective atomic fragment contributions. The TAE technology [6, 7, 8, 9] provides a rapid means of computing electronic property information for large molecular datasets. Among the descriptors used in this study are traditional 2D MOE descriptors and topological RECON autocorrelation descriptors (RAD), which are autocorrelations of surface integrals of various electron density derived (TAE) atomic properties P_x, P_y:

    A(R_{xy}) = \frac{1}{n} \sum_{x=1}^{n} \sum_{y=1}^{n} P_x P_y,    (1)

binned by the minimum bond path R_{xy} (topological distance) between the respective pair of atoms (x, y). Use of the minimum bond path allows topological autocorrelation descriptors to be computed without the need for three-dimensional energy minimized structures [10]. The electron density-derived properties used are the electrostatic potential, the electronic kinetic energy density, gradients of the electron density and electronic kinetic energy density normal to an electron density isosurface (corresponding to the molecular van der Waals surface), the Fukui function, the Laplacian distribution of the electron density, the bare nuclear potential and a local average of the ionization potential on the surface. These features have been described in detail elsewhere [6, 7, 8, 9] (online version for proteins and polypeptides available at http://reccr.chem.rpi.edu/). The implementation [10] of the RECON algorithm within MOE is used in this study.
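To make Eq. (1) concrete, the following is a minimal MATLAB sketch of a RAD-style topological autocorrelation. The inputs are assumptions, not the authors' implementation: P is a hypothetical vector of per-atom TAE surface properties and D a hypothetical matrix of minimum bond-path (topological) distances, with binning by integer bond-path distance chosen purely for illustration.

% Minimal sketch of a topological autocorrelation descriptor in the spirit
% of Eq. (1). P is an n-by-1 vector of per-atom TAE surface properties and
% D is an n-by-n matrix of minimum bond-path distances (both assumed inputs).
function A = rad_autocorrelation(P, D, maxdist)
    n = numel(P);
    A = zeros(1, maxdist);
    for r = 1:maxdist
        mask = double(D == r);        % atom pairs at bond-path distance r
        A(r) = (P' * mask * P) / n;   % (1/n) * sum over those pairs of P_x * P_y
    end
end

Each entry A(r) would then serve as one descriptor for the bin at bond-path distance r.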
SIMIL scores are a type of similarity score, new to this study, between pairs of amino acid residues. Each SIMIL score is a two-part score, consisting of a ClassScore and a RECONScore. The ClassScore is a weighted score constructed out of bits representing the presence of the following physical characteristics: tiny, small, positive, negative, polar, non-polar, aliphatic, and aromatic. The RECONScore is constructed from weighted differences of TAE RECON descriptors. These SIMIL scores appear as a 20-by-20 similarity matrix.
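The exact weights behind the ClassScore and RECONScore are not given here, so the following MATLAB sketch only illustrates the general two-part construction described above; the function name simil_score, the inputs bits1/bits2 and recon1/recon2, and the weights are all hypothetical.

% Illustrative sketch of a two-part SIMIL-style score for a pair of residues.
% The weight vectors w_class and w_recon are assumptions; the authors'
% values are not specified in the text.
function s = simil_score(bits1, bits2, recon1, recon2, w_class, w_recon)
    % ClassScore: weighted comparison of the eight physical-property bits
    % (tiny, small, positive, negative, polar, non-polar, aliphatic, aromatic)
    class_score = sum(w_class .* double(bits1 == bits2));
    % RECONScore: weighted differences of TAE RECON descriptors
    recon_score = sum(w_recon .* abs(recon1 - recon2));
    s = [class_score, recon_score];   % the score is two-part
end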
5 Machine learning methods

Linear regression is based on the notion of the dot-product function in Euclidean space. For example, each entry of the covariance matrix C is given by the dot-product function:

    c(x, x') = x^T x'.    (2)

Nonlinear regressions can be achieved by using other functions, called kernel functions, that satisfy dot-product properties in a different space called a reproducing kernel Hilbert space (RKHS). A linear regression can be computed in the RKHS, which is usually of much higher (and possibly infinite) dimensionality, resulting in a model that is nonlinear in Euclidean space. Each entry of the kernel matrix K, of the same size as C, requires one evaluation of the kernel function. The fact that the computational effort of working in a potentially infinite-dimensional space is capped by the number of samples is called the kernel trick.
The combination of PLS with kernels produces a powerful algorithm: kernel partial least squares regression [2]. The model is of the form

    K \beta \approx y,    (3)

where K is a square kernel matrix, whose size is the number of samples, computed from the features, and y is a vector of responses. The most often cited kernel function is the Gaussian kernel, given by

    k(x, x') = \exp\left( -\frac{\|x - x'\|_2^2}{2\eta^2} \right),    (4)

where x, x' are sample vectors. A variant is the exponential kernel:

    k(x, x') = \exp\left( -\frac{\|x - x'\|_2}{2\eta} \right).    (5)

Working with either kernel requires setting the parameter η.
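As a point of reference, a minimal MATLAB sketch of building a kernel matrix under Eqs. (4) and (5) follows; the function name kernel_matrix and its argument kind are illustrative choices, not the authors' code.

% Minimal sketch: kernel matrix between sample sets X1 and X2 (rows are
% samples) for the Gaussian kernel of Eq. (4) or the exponential kernel of
% Eq. (5). eta is the kernel parameter discussed in the text.
function K = kernel_matrix(X1, X2, eta, kind)
    sq1 = sum(X1.^2, 2);
    sq2 = sum(X2.^2, 2);
    D2  = max(bsxfun(@plus, sq1, sq2') - 2 * (X1 * X2'), 0);  % squared distances
    if strcmp(kind, 'gaussian')
        K = exp(-D2 / (2 * eta^2));       % Eq. (4)
    else
        K = exp(-sqrt(D2) / (2 * eta));   % Eq. (5)
    end
end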
The vector of coefficients β is calculated as

    \beta = U (T^T K U)^{-1} T^T y.    (6)

The columns u and t of the matrices U and T are found iteratively from the KPLS algorithm [2]:

1. Solve the eigenproblem (K y y^T) t = \lambda t for t.
2. Compute u = y y^T t.
3. Deflate the kernel matrix: K \leftarrow (I - t t^T) K (I - t t^T).

At each iteration, u and t are chosen so as to maximize the covariance between them [2]. The number of columns of U and T is equal to the number ν of latent variables of the model. Equivalently, the covariance between the projection of y onto K is maximized.
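The following MATLAB sketch strings these steps together for a single response vector. It is a simplified reading of the algorithm of [2] as summarized above, not the code actually used in this study; note that, for a single response, the dominant eigenvector of K y y^T is proportional to K y, which step 1 exploits directly.

% Minimal KPLS fit sketch (Eqs. 3-6): K is the kernel matrix over the
% calibration samples, y the response vector, nu the number of latent
% variables. Returns the coefficient vector beta of Eq. (6).
function beta = kpls_fit(K, y, nu)
    n  = size(K, 1);
    T  = zeros(n, nu);
    U  = zeros(n, nu);
    Kd = K;                             % working copy, deflated in place
    for i = 1:nu
        % Step 1: dominant eigenvector of Kd*y*y' (proportional to Kd*y here)
        t = Kd * y;
        t = t / norm(t);
        % Step 2: u = y*y'*t
        u = y * (y' * t);
        % Step 3: deflate the kernel matrix
        P  = eye(n) - t * t';
        Kd = P * Kd * P;
        T(:, i) = t;
        U(:, i) = u;
    end
    beta = U * ((T' * K * U) \ (T' * y));   % Eq. (6)
end

Deflating only K keeps the sketch close to the steps stated above; implementations following [2] may also deflate y at each iteration.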
A model is evaluated by comparing the prediction

    z = K \beta    (7)

against the known values y, and can be assessed using the correlation coefficient:

    r^2 = 1 - \frac{\|y - z\|_2^2}{\|y - E(y)\|_2^2}.    (8)

6 Implementation issues

Data for each round was centered and scaled to zero median and unit absolute deviation.
For each round, the PLS and KPLS algorithms were executed in Matlab using codes adapted from [11].
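For concreteness, a minimal MATLAB sketch of that preprocessing step is given below; whether the original work scaled by the mean or the median of the absolute deviations is not stated, so the median is assumed here, and the function name robust_scale is illustrative.

% Minimal sketch: center each descriptor column to zero median and scale it
% to unit absolute deviation (median absolute deviation assumed).
function Xs = robust_scale(X)
    med  = median(X, 1);                    % per-column medians
    Xc   = bsxfun(@minus, X, med);          % zero median
    adev = median(abs(Xc), 1);              % absolute deviation per column
    adev(adev == 0) = 1;                    % guard against constant columns
    Xs   = bsxfun(@rdivide, Xc, adev);      % unit absolute deviation
end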
As the objective of the contest is to maximize the performance of the model on the prediction set, models retained from the calibration set must be robust. This is achieved by leave-one-out (LOO) cross-validation (CV). For a calibration set consisting of n samples, this procedure involves using a single sample for validation and the remaining n - 1 samples for training. The training data is used to generate PLS/KPLS models and the validation data is used for model assessment. This is repeated n times, such that each sample is used once for validation. Then, a correlation coefficient (Eq. 8) can be calculated from each sample's cross-validated prediction.
The framework of LOO CV permits the setting of the model hyperparameters. For PLS, the sole hyperparameter is the number ν of latent variables. For KPLS, both ν and the kernel parameter η must be set. For each attempted combination of {ν, η}, a calibration set LOO CV r^2 is obtained, and hyperparameter values are chosen so as to maximize this correlation coefficient. But what values of {ν, η} are attempted? The number of latent variables is a positive integer, and we used the brute-force approach of trying all numbers between 1 and 20. As for η, it was optimized using MATLAB's simplex search, provided by the built-in routine fminsearch.
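A minimal MATLAB sketch of this search is given below, reusing the kernel_matrix and kpls_fit sketches from earlier and assuming the descriptor matrix X has already been preprocessed; the starting point eta0 passed to fminsearch is an assumption, since the actual starting value is not reported.

% Minimal sketch of the hyperparameter search: nu is scanned over 1..20 and,
% for each nu, eta is tuned by fminsearch to maximize the LOO CV correlation
% coefficient of Eq. (8).
function [best_nu, best_eta, best_r2] = tune_kpls(X, y, kind, eta0)
    best_r2 = -Inf;  best_nu = 1;  best_eta = eta0;
    for nu = 1:20
        [eta, negr2] = fminsearch(@(e) -loocv_r2(X, y, kind, abs(e), nu), eta0);
        if -negr2 > best_r2
            best_r2 = -negr2;  best_nu = nu;  best_eta = abs(eta);
        end
    end
end

function r2 = loocv_r2(X, y, kind, eta, nu)
    n = numel(y);
    z = zeros(n, 1);
    for i = 1:n
        train = true(n, 1);
        train(i) = false;                              % hold out sample i
        Ktr  = kernel_matrix(X(train, :), X(train, :), eta, kind);
        beta = kpls_fit(Ktr, y(train), nu);
        Kte  = kernel_matrix(X(i, :), X(train, :), eta, kind);
        z(i) = Kte * beta;                             % cross-validated prediction
    end
    r2 = 1 - norm(y - z)^2 / norm(y - mean(y))^2;      % Eq. (8)
end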
7 Contest modeling performances

This section summarizes the modeling methods used in the three regression tasks.

For the first round, 584 RECON features were generated for each peptide. These features were used to supplement the provided COEPRA descriptors. The submitted model exploited Gaussian KPLS with a calibration set LOO CV r^2 of 0.7120. Contest results report an r^2 of 0.602 in the prediction set, a fourth-place finish.

For the second round, 147 RECON descriptors were generated to supplement the provided COEPRA descriptors for each sample. Gaussian KPLS resulted in a calibration set r^2 of 0.5799. This model gave a prediction r^2 of 0.735, a first-place result. Moreover, the r^2 was significantly higher than that of the second-place finisher at 0.612, by 20.1%.

For the third round, 180 SIMIL descriptors were derived, corresponding to 20 descriptors per amino acid, which is the number of rows of a given column in the SIMIL similarity matrix. These descriptors were used for modeling in addition to the COEPRA ones. The exponential KPLS model was chosen this time, with r^2 = 0.3737 for LOO CV across the calibration set. Contest results report r^2 = 0.201 across the prediction set, a second-place finish.
8 Further analysis

Post-contest, it is possible to take a second look at the datasets and perform more formal analyses on the COEPRA datasets. For example, freed from the tight deadlines within which submissions must be made, it is possible to optimize the kernel parameter η to a higher level of accuracy, and to try a greater number of combinations of the COEPRA, RECON and SIMIL descriptors. Despite the fact that the responses for the prediction set are now known, this analysis assumes that they are not for the purposes of model selection. Hence, model parameters {ν, η} are chosen based on LOO CV across the calibration set, as before, and a model is selected based on its LOO CV r^2 across the calibration set, and not from the prediction set.

Three questions emerged from contest results:
1. What was the improvement of using KPLS over that of PLS models?
2. What was the value-added of using the 2D MOE and RECON autocorrelation descriptors (RAD)?
3. What is the value-added of using the SIMIL scores?

To address question 1, models were generated using PLS, Gaussian KPLS and exponential KPLS. To address questions 2 and 3, consistent sets of 327 2D MOE and RAD features were generated. Then, models were generated using only the COEPRA descriptors, only the MOE/RAD descriptors, only the SIMIL descriptors, both the COEPRA and MOE/RAD descriptors, the COEPRA and SIMIL descriptors, and all three sets of descriptors. Table 2 presents the results of these experiments.
Table 2: Results of post-contest experiments. The model with the highest calibration set LOO CV coefficient of correlation is bolded. The model with the highest prediction set correlation coefficient is italicized.

method            | PLS (linear)             | KPLS (Gaussian)          | KPLS (exponential)
                  | calibration  prediction  | calibration  prediction  | calibration  prediction
------------------+--------------------------+--------------------------+-------------------------
round 1           |                          |                          |
COEPRA            |    0.625       0.455     |    0.726       0.678     |    0.721       0.691
MOE/RAD           |    0.261       0.344     |    0.407       0.386     |    0.427       0.495
SIMIL             |    0.512       0.352     |    0.575       0.549     |    0.583       0.618
COEPRA+MOE/RAD    |    0.680       0.464     |    0.742       0.661     |    0.724       0.694
COEPRA+SIMIL      |    0.620       0.459     |    0.735       0.664     |    0.721       0.693
all               |    0.675       0.466     |    0.739       0.663     |    0.727       0.694
round 2           |                          |                          |
COEPRA            |    0.298       0.401     |    0.498       0.746     |    0.470       0.590
MOE/RAD           |    0.095       0.144     |    0.323       0.546     |    0.301       0.441
SIMIL             |    0.142       0.200     |    0.613       0.427     |    0.482       0.515
COEPRA+MOE/RAD    |    0.293       0.403     |    0.502       0.784     |    0.464       0.591
COEPRA+SIMIL      |    0.279       0.412     |    0.505       0.754     |    0.475       0.595
all               |    0.275       0.414     |    0.509       0.782     |    0.469       0.596
round 3           |                          |                          |
COEPRA            |    0.302       0.153     |    0.354       0.200     |    0.373       0.219
MOE/RAD           |    0.162      -0.135     |    0.104       0.035     |    0.177       0.200
SIMIL             |    0.237       0.032     |    0.335       0.118     |    0.326       0.169
COEPRA+MOE/RAD    |    0.303       0.178     |    0.354       0.212     |    0.375       0.242
COEPRA+SIMIL      |    0.305       0.149     |    0.356       0.197     |    0.376       0.219
all               |    0.305       0.173     |    0.356       0.208     |    0.377       0.240
For round 1, Table 2 shows that Gaussian KPLS with the combined COEPRA and MOE/RAD descriptors finds calibration and prediction correlation coefficients of 0.741 and 0.661, respectively. The latter comes very close to the contest's first-place result of 0.677. Note that a higher performance would have been achieved had the exponential kernel been chosen. However, the assumption is that only calibration set responses are known. Hence, the retained model must be based upon performance on the calibration set only. Also note that almost identical performances, within 0.010, are found if COEPRA+SIMIL or all descriptors are used.
For round 2, it is Gaussian KPLS, using only the SIMIL descriptors, that boasts the highest calibration set LOO CV r^2, with a value of 0.613. However, it seems that the model performance does not translate well to the prediction set, with an r^2 of 0.427. It is noticed that all other models have an improved prediction set correlation coefficient. Ignoring that model for a moment, it is Gaussian KPLS with all descriptors that outperforms other models, with calibration and prediction r^2's of 0.509 and 0.781, respectively. It appears that an increased number of MOE and RAD features, in concert with a fine-tuning of the kernel parameter, achieves a model that has an r^2 0.046 (or 21.9%) higher than this paper's first-place contest submission presented in the previous section. Note again that almost equal results are obtained for COEPRA+MOE/RAD, COEPRA+SIMIL and all descriptors.
For round 3, the exponential kernel performs better than the Gaussian kernel. Once again, both sets of inputs are used. Once again, results between COEPRA+MOE/RAD, COEPRA+SIMIL and all descriptors are quasi-identical. The best of the three has r^2 performances of 0.375 and 0.242 across the calibration and prediction sets. This beats the contest winner by 0.006 or 2.5%. Looking back, it makes sense that the value-added of MOE/RAD and SIMIL are similar, since half of the weight of the SIMIL scores is based on MOE/RAD features.
9 Conclusion

Two conclusions stem from this paper. First, in answer to question 1 relating to the possibility of improvement of KPLS over PLS, this paper finds that there is a very significant advantage in using nonlinear KPLS models over linear PLS ones. Second, in answer to questions 2 and 3, while the MOE/RAD descriptors or the SIMIL scores are insufficient on their own to build performing models for the prediction of binding affinities, they contribute to improved modeling performance in conjunction with the COEPRA descriptors. With further knowledge of the nature of the COEPRA descriptors, it may be possible to further specify the value-added contribution of the MOE/RAD and SIMIL features.
Acknowledgments

This work was supported by NIH grant 1P20-HG003899-01. Charles Bergeron was supported by a doctoral fellowship from the Fonds québécois de la recherche sur la nature et les technologies. Margaret McLellan contributed a script used in sequence-to-structure conversion.
References

[1] Wold, S., Ruhe, H., Wold, H. and Dunn III, W.J. (1984) The collinearity problem in linear regression. The partial least squares (PLS) approach to the generalized inverse. SIAM Journal of Scientific and Statistical Computations, 5:735-743.

[2] Rosipal, R. and Trejo, L.J. (2001) Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space. Journal of Machine Learning Research, 2:97-123.

[3] Doytchinova, I.A., Walshe, V., Borrow, P. and Flower, D.R. (2005) Towards the chemometric dissection of peptide HLA-A*0201 binding affinity: comparison of local and global QSAR models. Journal of Computer-Aided Molecular Design, 19:203-212.

[4] Hattotuwagama, C.K., Guan, P., Doytchinova, I.A. and Flower, D.R. (2004) New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity. Organic and Biomolecular Chemistry, 2:3274-3283.

[5] Doytchinova, I.A. and Flower, D.R. (2002) Physicochemical Explanation of Peptide Binding to HLA-A*0201 Major Histocompatibility Complex: A Three-Dimensional Quantitative Structure-Activity Relationship Study. Proteins: Structure, Function and Genetics, 48:505-518.

[6] Breneman, C.M. and Rhem, M. (1997) A QSPR Analysis of HPLC Column Capacity Factors for a Set of High-Energy Materials Using Electronic Van der Waals Surface Property Descriptors Computed by the Transferable Atom Equivalent Method. J. Comput. Chem., 18(2):182-197.

[7] Breneman, C.M., Thompson, T.R., Rhem, M. and Dung, M. (1995) Electron Density Modeling of Large Systems Using the Transferable Atom Equivalent Method. Computers & Chemistry, 19(3):161.

[8] Whitehead, C.E., Sukumar, N., Breneman, C.M. and Ryan, M.D. (2003) Transferable Atom Equivalent Multi-Centered Multipole Expansion Method. J. Comput. Chem., 24:512-529.

[9] Sukumar, N. and Breneman, C.M. (2007) QTAIM in Drug Discovery and Protein Modeling. In The Quantum Theory of Atoms in Molecules: From Solid State to DNA and Drug Design (C.F. Matta and R.J. Boyd, Editors), Wiley-VCH.

[10] Katt, B. (2004) A Semi-Automated Approach to Molecular Discovery Through Virtual High Throughput Screening. Rensselaer Polytechnic Institute, Troy, New York.

[11] Shawe-Taylor, J. and Cristianini, N. (2004) Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge, UK.
