arXiv:1108.5397v1 [stat.ML] 26 Aug 2011

Prediction of peptide bonding affinity: kernel methods for nonlinear modeling

Charles Bergeron
Department of Mathematical Sciences

Theresa Hepburn, C. Matthew Sundling, Michael Krein, Bill Katt, Nagamani Sukumar, Curt M. Breneman
Center for Biotechnology and Interdisciplinary Studies

Kristin P. Bennett
Departments of Mathematical Sciences and Computer Science

Rensselaer Polytechnic Institute, Troy, NY 12180

Tuesday 18th September, 2018

Abstract

This paper presents regression models obtained from a process of blind prediction of peptide binding affinity from provided descriptors for several distinct datasets, as part of the 2006 Comparative Evaluation of Prediction Algorithms (COEPRA) contest. This paper finds that kernel partial least squares, a nonlinear partial least squares (PLS) algorithm, outperforms PLS, and that the incorporation of transferable atom equivalent features improves predictive capability.

Keywords: chemometrics, peptide bonding affinity, machine learning, kernel partial least squares, transferable atom equivalent descriptors

List of acronyms: Comparative Evaluation of Prediction Algorithms (COEPRA), cross-validation (CV), kernel partial least squares (KPLS), leave-one-out (LOO), partial least squares (PLS), reproducing kernel Hilbert space (RKHS), Transferable Atom Equivalent (TAE), two-dimensional (2D)

E-mail address: chbergeron@gmail.com
1 Introduction

Comparative Evaluation of Prediction Algorithms (COEPRA, http://www.coepra.org/) is a modeling competition organized to provide objective testing of various algorithms via the process of blind prediction for chemical, biological, and medical data. COEPRA's stated goals are to advance modeling algorithms and software as well as to provide reference datasets to the research community.
Transferable Atom Equivalent (TAE) RECON features are electron-density-derived descriptors obtained by fragment reconstruction. MOE features are geometrical, structural, physicochemical and topological 2D descriptors. RAD features are topological autocorrelations of TAE RECON descriptors. This paper shows how their addition to the COEPRA descriptors improves modeling performance.
Partial least squares (PLS) regression is a machine learning technique. Because it considers the covariance of the inputs with the output to guide the selection of features, it is much more stable than multiple linear regression. This approach was developed for, and is popular within, the field of chemometrics, where the number of variables is much greater than the number of samples and where a high occurrence of correlated representations exists [1]. Less well known to the chemometrics community is kernel partial least squares (KPLS) regression, a generalization of PLS that finds a nonlinear relation between features instead of being limited to a linear combination thereof [2]. This paper demonstrates how KPLS largely outperforms PLS on the COEPRA datasets.
The remainder of this paper accomplishes the following:
- The COEPRA datasets are described.
- TAE RECON features are defined, and SIMIL scores are introduced.
- PLS is generalized to a nonlinear, KPLS framework.
- Implementation issues for KPLS are discussed.
- Models submitted to the contest and their performances are stated.
- Post-contest analysis of the datasets, resulting in new, better-performing models, is presented.
- Conclusions for the paper are provided.
2 Notation

Let $x$ denote a column vector. Let $x^T$ denote the transpose of $x$. Let $X$ denote a matrix, $X^T$ its transpose and $X^{-1}$ its inverse. Write the identity matrix of appropriate size as $I$. The expected value of a set of numbers assembled into vector $x$ is written $E(x)$ and represents the mean value: $E(x) = \frac{1}{n}\sum_{i=1}^{n} x_i$. The Euclidean norm $\|x\|_2$ of $x$ is given by $\|x\|_2 = \sqrt{\sum_{i=1}^{n} x_i^2}$.
Table 1: Basic information about the datasets.

dataset   calibration samples   prediction samples   amino acids   COEPRA descriptors
   1               89                   88                 9               5787
   2               76                   76                 8               5144
   3              133                  133                 9               5787

3 COEPRA datasets

Three regression tasks were proposed during the summer of 2006.
For each task, a dataset consists of a calibration set and a prediction set. The following data are provided: for the calibration set, the COEPRA descriptors and the corresponding responses for each sample; for the prediction set, the COEPRA descriptors for each sample. Hence, the calibration set is used to develop a model, and this model is propagated to the prediction set descriptors to make a prediction for the response. These predictions are compared with the actual values by the contest organizers after the close of the submission date.
Each sample consists of a peptide sequence of amino acid residues (rounds 1 and 3 involve nonapeptides while round 2 involves octapeptides) and 643 COEPRA descriptors per amino acid; a sample therefore carries 9 × 643 = 5787 descriptors in rounds 1 and 3, and 8 × 643 = 5144 in round 2. The nature of these descriptors remains unknown to this time. Table 1 presents basic characteristics of each round.
During the contest, the nature of the regression value was not known. That is no longer the case. For round 1, the output is the bonding affinity to the HLA-A*0201 major histocompatibility complex [3]. For round 2, the output is the binding affinity of the mouse class I major histocompatibility complex [4]. For round 3, the output is the bonding affinity to the HLA-A*0201 major histocompatibility complex [5]. The contest methodology proposed, for each round, to learn a model from calibration data, for which the response is known, and then propagate that model to prediction data to establish a prediction of the response that could then be evaluated by contest organizers against the true values. Contestants were free to add additional descriptors; thus we augmented the COEPRA descriptors with MOE and RAD descriptors, which we now describe.
4 Atomic charge density fragment features

RECON is an algorithm for the rapid reconstruction of molecular electron densities and electron-density-based properties of molecules, using pre-computed atomic charge density fragments and associated descriptors stored in a Transferable Atom Equivalent library. Molecular TAE descriptors are constructed in most cases by summation of the respective atomic fragment contributions. The TAE technology [6, 7, 8, 9] provides a rapid means of computing electronic property information for large molecular datasets. Among the descriptors used in this study are traditional 2D MOE descriptors and topological RECON autocorrelation descriptors (RAD), which are autocorrelations of surface integrals of various electron-density-derived (TAE) atomic properties $P_x$, $P_y$:

\[
A(R_{xy}) = \frac{1}{n} \sum_{x=1}^{n} \sum_{y=1}^{n} P_x P_y \qquad (1)
\]

binned by the minimum bond path $R_{xy}$ (topological distance) between the respective pair of atoms $(x, y)$. Use of the minimum bond path allows topological autocorrelation descriptors to be computed without the need for three-dimensional energy-minimized structures [10]. The electron-density-derived properties used are the electrostatic potential, the electronic kinetic energy density, gradients of the electron density and electronic kinetic energy density normal to an electron density isosurface (corresponding to the molecular van der Waals surface), the Fukui function, the Laplacian distribution of the electron density, the bare nuclear potential, and a local average of the ionization potential on the surface. These features have been described in detail elsewhere [6, 7, 8, 9] (online version for proteins and polypeptides available at http://reccr.chem.rpi.edu/). The implementation [10] of the RECON algorithm within MOE is used in this study.
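As an illustration of Eq. (1), the following is a minimal sketch (not the RECON/MOE implementation) of a topological autocorrelation binned by bond-path distance. The inputs `P` (an electron-density-derived atomic property), `D` (the matrix of minimum bond-path distances) and the bin count `max_dist` are assumed to be supplied by the caller.

```python
import numpy as np

def topological_autocorrelation(P, D, max_dist):
    """Sketch of Eq. (1): accumulate P[x] * P[y] over all atom pairs (x, y),
    binned by their minimum bond-path (topological) distance D[x, y],
    and divide by the number of atoms n."""
    n = len(P)
    A = np.zeros(max_dist + 1)
    for x in range(n):
        for y in range(n):
            r = int(D[x, y])
            if r <= max_dist:
                A[r] += P[x] * P[y]
    return A / n
```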
SIMIL scores are a new type of similarity score, new to this study, between pairs of amino acid residues. Each SIMIL score is a two-part score, consisting of a ClassScore and a RECONScore. The ClassScore is a weighted score constructed out of bits representing the presence of the following physical characteristics: tiny, small, positive, negative, polar, non-polar, aliphatic, and aromatic. The RECONScore is constructed from weighted differences of TAE RECON descriptors. These SIMIL scores appear as a 20-by-20 similarity matrix.
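The exact weights behind the ClassScore and RECONScore are not given here; purely as an illustration of how a 20-by-20 similarity matrix could be turned into peptide descriptors (20 per residue, hence 180 for a nonapeptide as used in round 3), a sketch might look as follows. The amino-acid ordering and the `simil_matrix` argument are assumptions.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # assumed ordering of the 20 residues

def simil_descriptors(sequence, simil_matrix):
    """Map each residue of a peptide to its column of the 20x20 SIMIL
    similarity matrix; a nonapeptide yields 9 * 20 = 180 descriptors."""
    cols = [simil_matrix[:, AMINO_ACIDS.index(aa)] for aa in sequence]
    return np.concatenate(cols)
```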
5 Machine learning methods

Linear regression is based on the notion of the dot-product function in Euclidean space. For example, each entry of the covariance matrix $C$ is given by the dot-product function:

\[
c(x, x') = x^T x' \qquad (2)
\]

Nonlinear regressions can be achieved by using other functions, called kernel functions, that satisfy dot-product properties in a different space called a reproducing kernel Hilbert space (RKHS). A linear regression can be computed in the RKHS, which is usually of much higher (and possibly infinite) dimensionality, resulting in a model that is nonlinear in Euclidean space. Each entry of the kernel matrix $K$, of the same size as $C$, requires one evaluation of the kernel function. The fact that the computational effort of working in a potentially infinite-dimensional space is capped by the number of samples is called the kernel trick.

The combination of PLS with kernels produces a powerful algorithm: kernel partial least squares regression [2]. The model is of the form

\[
K\beta \approx y \qquad (3)
\]

where $K$ is a square kernel matrix, whose size is the number of samples, computed from the features, and $y$ is a vector of responses. The most oft-cited kernel function is the Gaussian one, given by

\[
k(x, x') = \exp\left(-\frac{\|x - x'\|_2^2}{2\eta^2}\right) \qquad (4)
\]

where $x$, $x'$ are sample vectors. A variant is the exponential kernel:

\[
k(x, x') = \exp\left(-\frac{\|x - x'\|_2}{2\eta}\right) \qquad (5)
\]

Working with either kernel requires setting the parameter $\eta$.
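As a minimal sketch of Eqs. (4) and (5), the two kernel matrices can be computed from a sample matrix X (one row per sample) as follows; the function names are ours, and the η parameterization follows the formulas as reconstructed above.

```python
import numpy as np

def pairwise_sq_dists(X, Z):
    """Squared Euclidean distances between rows of X and rows of Z."""
    return ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)

def gaussian_kernel(X, Z, eta):
    """Eq. (4): k(x, x') = exp(-||x - x'||^2 / (2 * eta^2))."""
    return np.exp(-pairwise_sq_dists(X, Z) / (2.0 * eta ** 2))

def exponential_kernel(X, Z, eta):
    """Eq. (5): k(x, x') = exp(-||x - x'|| / (2 * eta))."""
    return np.exp(-np.sqrt(pairwise_sq_dists(X, Z)) / (2.0 * eta))
```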
The vector of coefficients $\beta$ is calculated as

\[
\beta = U (T^T K U)^{-1} T^T y \qquad (6)
\]

The columns $u$ and $t$ of matrices $U$ and $T$ are found iteratively from the KPLS algorithm [2]:

1. Solve the eigenproblem $(K y y^T) t = \lambda t$ for $t$.
2. Compute $u = y y^T t$.
3. Deflate the kernel matrix: $K \leftarrow (I - t t^T) K (I - t t^T)$.

At each iteration, $u$ and $t$ are chosen so as to maximize the covariance between them [2]. The number of columns of $U$ and $T$ is equal to the number $\nu$ of latent variables of the model. Equivalently, the covariance between the projection of $y$ onto $K$ is maximized.

A model is evaluated by comparing the prediction

\[
z = K\beta \qquad (7)
\]

against the known values $y$, and can be assessed using the correlation coefficient:

\[
r^2 = 1 - \frac{\|y - z\|_2^2}{\|y - E(y)\|_2^2} \qquad (8)
\]
6 Implementation issues

Data for each round was centered and scaled to zero median and unit absolute deviation. For each round, the PLS and KPLS algorithms were executed in Matlab using codes adapted from [11].
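This centering and scaling is a robust analogue of the usual mean/standard-deviation autoscaling. The text does not specify which absolute deviation is meant; a minimal sketch using the median absolute deviation, applied column-wise, could look as follows (the epsilon guard against constant columns is ours). In practice the medians and deviations estimated on the calibration set would also be applied to the prediction set.

```python
import numpy as np

def robust_scale(X, eps=1e-12):
    """Center each column to zero median and scale it to unit (median)
    absolute deviation; returns the scaled data and the scaling parameters."""
    med = np.median(X, axis=0)
    mad = np.median(np.abs(X - med), axis=0)
    return (X - med) / (mad + eps), med, mad
```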
As the objective of the contest is to maximize the performance of the model on the prediction set, retained models from the calibration set must be robust. This is achieved by leave-one-out (LOO) cross-validation (CV). For a calibration set consisting of $n$ samples, this procedure involves using a single sample for validation and the remaining $n - 1$ samples for training. The training data is used to generate PLS/KPLS models and the validation data is used for model assessment. This is repeated $n$ times, such that each sample is used once for validation. Then, a correlation coefficient (Eq. 8) can be calculated from each sample's cross-validated prediction.
The framework of LOO CV permits the setting of the model hyperparameters. For PLS, the sole hyperparameter is the number $\nu$ of latent variables. For KPLS, both $\nu$ and the kernel parameter $\eta$ must be set. For each attempted combination of $\{\nu, \eta\}$, a calibration set LOO CV $r^2$ is obtained, and hyperparameter values are chosen so as to maximize the correlation coefficient. But what values of $\{\nu, \eta\}$ are attempted? The number of latent variables is a positive integer, and we used the brute-force approach of trying all numbers between 1 and 20. As for $\eta$, it was optimized using MATLAB's simplex search provided by the built-in routine fminsearch.
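A compact sketch of this model-selection loop for Gaussian KPLS: $\nu$ is searched by brute force over 1 to 20 and, for each $\nu$, $\eta$ is tuned by a Nelder-Mead simplex search (scipy's minimize standing in for MATLAB's fminsearch), with each candidate scored by the calibration-set LOO CV $r^2$ of Eq. (8). The helpers gaussian_kernel, kpls_fit and r_squared are the sketches given earlier; the starting value eta0 is an arbitrary assumption.

```python
import numpy as np
from scipy.optimize import minimize
# gaussian_kernel, kpls_fit and r_squared are the sketches defined above.

def loo_cv_r2(X, y, eta, n_components):
    """Leave-one-out cross-validated r^2 for Gaussian KPLS at (nu, eta)."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        train = np.delete(np.arange(n), i)
        K = gaussian_kernel(X[train], X[train], eta)
        beta = kpls_fit(K, y[train], n_components)
        k_new = gaussian_kernel(X[i:i + 1], X[train], eta)
        preds[i] = float(k_new @ beta)
    return r_squared(y, preds)

def select_hyperparameters(X, y, eta0=1.0, max_nu=20):
    """Brute force over nu = 1..max_nu; simplex (Nelder-Mead) search over eta."""
    best = (-np.inf, None, None)
    for nu in range(1, max_nu + 1):
        res = minimize(lambda e: -loo_cv_r2(X, y, abs(e[0]), nu),
                       x0=[eta0], method="Nelder-Mead")
        if -res.fun > best[0]:
            best = (-res.fun, nu, abs(res.x[0]))
    return best  # (LOO CV r^2, nu, eta)
```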
7 Contest modeling performances

This section summarizes the modeling methods used in the three regression tasks.

For the first round, 584 RECON features were generated for each peptide. These features were used to supplement the provided COEPRA descriptors. The submitted model exploited Gaussian KPLS, with a calibration set LOO CV $r^2$ of 0.7120. Contest results report an $r^2$ of 0.602 on the prediction set, a fourth-place finish.

For the second round, 147 RECON descriptors were generated to supplement the provided COEPRA descriptors for each sample. Gaussian KPLS resulted in a calibration set $r^2$ of 0.5799. This model gave a prediction $r^2$ of 0.735, a first-place result. Moreover, the $r^2$ was significantly higher than that of the second-place finisher at 0.612, by 20.1%.

For the third round, 180 SIMIL descriptors were derived, corresponding to 20 descriptors per amino acid, which is the number of rows of a given column in the SIMIL similarity matrix. These descriptors were used for modeling in addition to the COEPRA ones. The exponential KPLS model was chosen this time, with $r^2 = 0.3737$ for LOO CV across the calibration set. Contest results report $r^2 = 0.201$ across the prediction set, a second-place finish.
8 Further analysis

Post-contest, it is possible to take a second look at the datasets and perform more formal analyses on the COEPRA datasets. For example, freed from the tight deadlines within which submissions must be made, it is possible to optimize the kernel parameter $\eta$ to a higher level of accuracy, and to try a greater number of combinations of the COEPRA, RECON and SIMIL descriptors. Despite the fact that the responses for the prediction set are now known, this analysis assumes that they are not for the purposes of model selection. Hence, model parameters $\{\nu, \eta\}$ are chosen based on LOO CV across the calibration set, as before, and a model is selected based on its LOO CV $r^2$ across the calibration set, not from the prediction set.

Three questions emerged from contest results:
1. What was the improvement of using KPLS over that of PLS models?
2. What was the value added of using the 2D MOE and RECON autocorrelation descriptors (RAD)?
3. What is the value added of using the SIMIL scores?

To address question 1, models were generated using PLS, Gaussian KPLS and exponential KPLS. To address questions 2 and 3, consistent sets of 327 2D MOE and RAD features were generated. Then, models were generated using only the COEPRA descriptors, only the MOE/RAD descriptors, only the SIMIL descriptors, both the COEPRA and MOE/RAD descriptors, COEPRA and SIMIL, and all three sets of descriptors. Table 2 presents the results of these experiments.
Table 2: Results of post-contest experiments. The model with the highest calibration set LOO CV coefficient of correlation is bolded. The model with the highest prediction set correlation coefficient is italicized.

method                 PLS (linear)         KPLS (Gaussian)      KPLS (exponential)
                       calibr.   predict.   calibr.   predict.   calibr.   predict.
round 1
  COEPRA                0.625     0.455      0.726     0.678      0.721     0.691
  MOE/RAD               0.261     0.344      0.407     0.386      0.427     0.495
  SIMIL                 0.512     0.352      0.575     0.549      0.583     0.618
  COEPRA+MOE/RAD        0.680     0.464      0.742     0.661      0.724     0.694
  COEPRA+SIMIL          0.620     0.459      0.735     0.664      0.721     0.693
  all                   0.675     0.466      0.739     0.663      0.727     0.694
round 2
  COEPRA                0.298     0.401      0.498     0.746      0.470     0.590
  MOE/RAD               0.095     0.144      0.323     0.546      0.301     0.441
  SIMIL                 0.142     0.200      0.613     0.427      0.482     0.515
  COEPRA+MOE/RAD        0.293     0.403      0.502     0.784      0.464     0.591
  COEPRA+SIMIL          0.279     0.412      0.505     0.754      0.475     0.595
  all                   0.275     0.414      0.509     0.782      0.469     0.596
round 3
  COEPRA                0.302     0.153      0.354     0.200      0.373     0.219
  MOE/RAD               0.162    -0.135      0.104     0.035      0.177     0.200
  SIMIL                 0.237     0.032      0.335     0.118      0.326     0.169
  COEPRA+MOE/RAD        0.303     0.178      0.354     0.212      0.375     0.242
  COEPRA+SIMIL          0.305     0.149      0.356     0.197      0.376     0.219
  all                   0.305     0.173      0.356     0.208      0.377     0.240
For round 1, Table 2 shows that Gaussian KPLS with the combined COEPRA and MOE/RAD descriptors finds calibration and prediction correlation coefficients of 0.741 and 0.661, respectively. The latter comes very close to the contest's first-place result of 0.677. Note that a higher performance would have been achieved had the exponential kernel been chosen. However, the assumption is that only calibration set responses are known. Hence, the retained model must be based upon performance on the calibration set only. Also note that almost identical performances, within 0.010, are found if COEPRA+SIMIL or all descriptors are used.
For round 2, it is Gaussian KPLS, using only the SIMIL descriptors, that boasts the highest calibration set LOO CV $r^2$, with a value of 0.613. However, it seems that the model performance does not translate well to the prediction set, with an $r^2$ of 0.427. It is noticed that all other models have an improved prediction set correlation coefficient. Ignoring that model for a moment, it is Gaussian KPLS with all descriptors that outperforms other models, with calibration and prediction $r^2$'s of 0.509 and 0.781, respectively. It appears that an increased number of MOE and RAD features, in concert with a fine-tuning of the kernel parameter, achieves a model that has an $r^2$ 0.046 (or 21.9%) higher than this paper's first-place contest submission presented in the previous section. Note again that almost equal results are obtained for COEPRA+MOE/RAD, COEPRA+SIMIL and all descriptors.
For round 3, the exponential kernel performs better than the Gaussian kernel. Once again, both sets of inputs are used. Once again, results between COEPRA+MOE/RAD, COEPRA+SIMIL and all descriptors are quasi-identical. The best of the three has $r^2$ performances of 0.375 and 0.242 across the calibration and prediction sets. This beats the contest winner by 0.006 or 2.5%. Looking back, it makes sense that the value added of MOE/RAD and SIMIL is similar, since half of the weight of the SIMIL scores is based on MOE/RAD features.
9 Conclusion

Two conclusions stem from this paper. First, in answer to question 1 relating to the possibility of improvement of KPLS over PLS, this paper finds that there is a very significant advantage in using nonlinear KPLS models over linear PLS ones. Second, in answer to questions 2 and 3, while MOE/RAD descriptors or the SIMIL scores are insufficient to build performing models for the prediction of binding affinities, they contribute to improved modeling performance in conjunction with the COEPRA descriptors. With further knowledge of the nature of the COEPRA descriptors, it may be possible to further specify the value-added contribution of the MOE/RAD and SIMIL features.
Acknowledgments

This work was supported by NIH grant 1P20-HG003899-01. Charles Bergeron was supported by a doctoral fellowship from the Fonds québécois de la recherche sur la nature et les technologies. Margaret McLellan contributed a script used in sequence-to-structure conversion.
References

[1] Wold, S., Ruhe, A., Wold, H. and Dunn III, W.J. (1984) The collinearity problem in linear regression. The partial least squares (PLS) approach to generalized inverses. SIAM Journal on Scientific and Statistical Computing. 5:735-743.

[2] Rosipal, R. and Trejo, L.J. (2001) Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space. Journal of Machine Learning Research. 2:97-123.

[3] Doytchinova, I.A., Walshe, V., Borrow, P. and Flower, D.R. (2005) Towards the chemometric dissection of peptide HLA-A*0201 binding affinity: comparison of local and global QSAR models. Journal of Computer-Aided Molecular Design. 19:203-212.

[4] Hattotuwagama, C.K., Guan, P., Doytchinova, I.A. and Flower, D.R. (2004) New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity. Organic and Biomolecular Chemistry. 2:3274-3283.

[5] Doytchinova, I.A. and Flower, D.R. (2002) Physicochemical Explanation of Peptide Binding to HLA-A*0201 Major Histocompatibility Complex: A Three-Dimensional Quantitative Structure-Activity Relationship Study. Proteins: Structure, Function and Genetics. 48:505-518.

[6] Breneman, C.M. and Rhem, M. (1997) A QSPR Analysis of HPLC Column Capacity Factors for a Set of High-Energy Materials Using Electronic Van der Waals Surface Property Descriptors Computed by the Transferable Atom Equivalent Method. J. Comput. Chem. 18(2):182-197.

[7] Breneman, C.M., Thompson, T.R., Rhem, M. and Dung, M. (1995) Electron Density Modeling of Large Systems Using the Transferable Atom Equivalent Method. Computers & Chemistry. 19(3):161.

[8] Whitehead, C.E., Sukumar, N., Breneman, C.M. and Ryan, M.D. (2003) Transferable Atom Equivalent Multi-Centered Multipole Expansion Method. J. Comp. Chem. 24:512-529.

[9] Sukumar, N. and Breneman, C.M. (2007) QTAIM in Drug Discovery and Protein Modeling. In The Quantum Theory of Atoms in Molecules: From Solid State to DNA and Drug Design (C.F. Matta and R.J. Boyd, editors). Wiley-VCH.

[10] Katt, B. (2004) A Semi-Automated Approach to Molecular Discovery Through Virtual High Throughput Screening. Rensselaer Polytechnic Institute, Troy, New York.

[11] Shawe-Taylor, J. and Cristianini, N. (2004) Kernel Methods for Pattern Analysis. Cambridge University Press, Cambridge, UK.
