© 2014 Trade Science Inc.
ISSN: 0974-7435, Volume 10, Issue 24
BioTechnology: An Indian Journal, FULL PAPER, BTAIJ, 10(24), 2014 [16338-16346]

Application research of decision tree algorithm in English grade analysis

Zhao Kun
Beihua University, Teacher's College, Jilin, (CHINA)

ABSTRACT

This paper introduces and analyses data mining in the management of students' grades. We use the decision tree in the analysis of grades and investigate the attribute selection measure, including data cleaning. We take the course scores of an institute of English language as an example and produce a decision tree using the ID3 algorithm, giving the detailed calculation process. Because the original algorithm lacks a termination condition, we propose an improved algorithm, which helps us to find the latent factors that influence the grades.

KEYWORDS

Decision tree algorithm; English grade analysis; ID3 algorithm; Classification.
INTRODUCTION

With the rapid development of higher education, English grade analysis, as an important guarantee of scientific management, constitutes a main part of English educational assessment.
Research on the application of data mining to the management of students' grades addresses how to use data mining to extract useful, previously hidden information from the large amounts of data accumulated in grade management [1-5]. This paper introduces and analyses data mining in the management of students' grades. It uses the decision tree in the analysis of grades, describes the function, status and deficiencies of grade management, and shows how to employ the decision tree in managing students' grades. It improves the ID3 algorithm to analyze students' grades so that we can find the latent factors that influence the grades. Once such factors are identified, we can offer decision-making information to teachers and thereby advance the quality of teaching [6-10]. English grade analysis helps teachers improve teaching quality and provides decision support for school leaders.
The decision tree-based classification model is widely used because of its unique advantages. Firstly, the structure of the decision tree method is simple and it generates rules that are easy to understand. Secondly, the high efficiency of the decision tree model makes it well suited to training sets with large amounts of data. Furthermore, the computational cost of the decision tree algorithm is relatively low. The decision tree method usually does not require prior knowledge of the training data and is good at handling non-numeric data. Finally, the decision tree method has high classification accuracy; it identifies common characteristics of the objects in a collection and classifies them according to the classification model.
The original decision tree algorithm works in a top-down recursive way [11-12]. Attribute values are compared at the internal nodes of the decision tree, and according to the attribute value the corresponding branch is followed down from the node; the conclusion is read off at a leaf node. Therefore, a path from the root to a leaf node corresponds to a conjunctive rule, and the entire decision tree corresponds to a set of disjunctive rules.
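To make this rule correspondence concrete, the following short Python sketch (illustrative only; the toy tree, attribute names, and nested-dictionary representation are our own assumptions, not taken from the paper) walks a small decision tree and prints one conjunctive rule per root-to-leaf path; the whole tree then corresponds to the disjunction of these rules.

# Illustrative sketch: a decision tree as nested dicts, plus extraction of one
# conjunctive rule per root-to-leaf path. The toy tree below is hypothetical.
toy_tree = {
    "attribute": "course type",
    "branches": {
        "A": {"attribute": "paper difficulty",
              "branches": {"high": "medium", "low": "general"}},
        "B": "medium",
        "C": "general",
    },
}

def extract_rules(node, conditions=()):
    """Return (conditions, class_label) pairs, one per leaf of the tree."""
    if not isinstance(node, dict):          # a leaf: the node is the class label
        return [(conditions, node)]
    rules = []
    for value, child in node["branches"].items():
        rules.extend(extract_rules(child, conditions + ((node["attribute"], value),)))
    return rules

for conds, label in extract_rules(toy_tree):
    print("IF " + " AND ".join(f"{a} = {v}" for a, v in conds) + f" THEN grade = {label}")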
The decision tree generation algorithm is divided into two steps [13-15]. The first step is the generation of the tree: at the beginning all the data are at the root node, and the data are then recursively partitioned. The second step, tree pruning, removes some of the noise and abnormal data. The decision tree stops splitting a node when the data at the node belong to the same category, or when there are no attributes left with which to split the data.
In the next section, we introduce the construction of the decision tree. In Section 3 we introduce the attribute selection measure. In Section 4 we carry out empirical research based on the ID3 algorithm and propose an improved algorithm. In Section 5 we conclude the paper and give some remarks.
CONSTRUCTION OF DECISION TREE USING ID3

The growing step of the decision tree is shown in Figure 1. The decision tree generation algorithm is described as follows. The algorithm, Generate_decision_tree, produces a decision tree from the given training data. The input is the set of training samples, represented by discrete-valued attributes, together with the candidate attribute set attribute_list. The output is a decision tree.
Step 1. Set up node N. If the samples are all in the same class C, then return N as a leaf node and label it with C.
Step 2. If attribute_list is empty, then return N as a leaf node and label it with the most common class in the samples.
Step 3. Choose test_attribute, the attribute with the highest information gain in attribute_list, and label N with test_attribute.
Step 4. For each known value a_i of test_attribute, perform the following operations.
Step 5. Node N produces a branch that satisfies the condition test_attribute = a_i.
Step 6. Let s_i be the set of samples for which test_attribute = a_i. If s_i is empty, then attach a leaf and label it with the most common class in the samples; otherwise attach the node returned by Generate_decision_tree(s_i, attribute_list - test_attribute).
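The following Python code is only a minimal sketch of Steps 1-6 under our own assumptions (the representation of samples as dictionaries and the class-label key name "score" are hypothetical); it is not the author's implementation.

# Minimal ID3-style sketch following Steps 1-6 (illustrative only).
# Each sample is a dict mapping attribute name -> value, plus a class-label key.
import math
from collections import Counter

LABEL = "score"  # hypothetical name of the class-label key

def entropy(samples):
    counts = Counter(s[LABEL] for s in samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(samples, attribute):
    remainder = 0.0
    for value in {s[attribute] for s in samples}:
        subset = [s for s in samples if s[attribute] == value]
        remainder += len(subset) / len(samples) * entropy(subset)
    return entropy(samples) - remainder

def majority_class(samples):
    return Counter(s[LABEL] for s in samples).most_common(1)[0][0]

def generate_decision_tree(samples, attribute_list):
    classes = {s[LABEL] for s in samples}
    if len(classes) == 1:                                  # Step 1: one class -> leaf
        return classes.pop()
    if not attribute_list:                                 # Step 2: no attributes -> majority leaf
        return majority_class(samples)
    test_attribute = max(attribute_list,                   # Step 3: highest information gain
                         key=lambda a: information_gain(samples, a))
    node = {"attribute": test_attribute, "branches": {}}
    remaining = [a for a in attribute_list if a != test_attribute]
    for value in {s[test_attribute] for s in samples}:     # Steps 4-5: one branch per value
        subset = [s for s in samples if s[test_attribute] == value]
        node["branches"][value] = generate_decision_tree(subset, remaining)  # Step 6
    return node

Under these assumptions, calling generate_decision_tree on the cleaned training samples with the four candidate attributes would return a nested dictionary representing the grown tree.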
Figure 1: Growing step of the decision tree

AN IMPROVED ALGORITHM

Attribute selection measure

Suppose S is a data sample set containing s samples, and the class label attribute has m distinct values defining m classes C_i (i = 1, 2, ..., m).
Suppose s_i is the number of samples of class C_i in S. The expected information needed to classify a given sample is given by formula (1) [11-12]:
I(s_1, s_2, \ldots, s_m) = -\sum_{i=1}^{m} p_i \log_2 p_i    (1)

where p_i is the probability that an arbitrary sample belongs to class C_i and is estimated by s_i / s. Suppose attribute A has V distinct values a_1, a_2, ..., a_V. Attribute A can be used to partition S into V subsets S_1, S_2, ..., S_V, and s_ij denotes the number of samples of class C_i in subset S_j. The expected information based on this partitioning by A is given by formula (2):

E(A) = \sum_{j=1}^{V} \frac{s_{1j} + s_{2j} + \cdots + s_{mj}}{s} I(s_{1j}, s_{2j}, \ldots, s_{mj})    (2)

The term (s_1j + s_2j + ... + s_mj)/s is the weight of the j-th subset. For a given subset S_j, formula (3) holds [13]:

I(s_{1j}, s_{2j}, \ldots, s_{mj}) = -\sum_{i=1}^{m} p_{ij} \log_2 p_{ij}    (3)

where p_ij = s_ij / |S_j| is the probability that a sample of S_j belongs to class C_i. If we branch on A, the information gain is given by formula (4) [14]:

Gain(A) = I(s_1, s_2, \ldots, s_m) - E(A)    (4)
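As an illustration only (not part of the paper; the function names are our own), the following Python helpers evaluate formulas (1)-(4) directly from class-count vectors.

import math

def info(counts):
    """Formulas (1)/(3): expected information I(s_1, ..., s_m) of a class-count vector."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def expected_info(subset_counts):
    """Formula (2): E(A), given one class-count vector per value of attribute A."""
    s = sum(sum(counts) for counts in subset_counts)
    return sum(sum(counts) / s * info(counts) for counts in subset_counts)

def gain(total_counts, subset_counts):
    """Formula (4): Gain(A) = I(s_1, ..., s_m) - E(A)."""
    return info(total_counts) - expected_info(subset_counts)

# Example call pattern (three classes, an attribute with two values):
# gain([300, 1950, 880], [[210, 950, 580], [90, 1000, 300]])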
The improved algorithm

The improved algorithm is as follows.
Function Generate_decision_tree(training samples, candidate attribute set attribute_list)
{
    Set up node N;
    If the samples are all in the same class C then
        return N as a leaf node and label it with C;
        record the statistical data meeting the conditions on the leaf node;
    If attribute_list is empty then
        return N as a leaf node and label it with the most common class of the samples;
        record the statistical data meeting the conditions on the leaf node;
    Suppose GainMax = max(Gain_1, Gain_2, ..., Gain_n);
    If GainMax = 0 then
        return N as a leaf node, label it with the most common class of the samples,
        and record the statistical data on the leaf node;
    otherwise continue as in the original algorithm: label N with the attribute whose gain equals GainMax, branch on its values, and call Generate_decision_tree recursively on each non-empty branch;
}

Data cleaning

Before mining, the raw score field ci_pj in the grade table ks is discretized into the grade label ci_pi, and the paper difficulty code sjnd is mapped to a descriptive level:

Update ks set ci_pi='outstanding' where ci_pj>='85'
Update ks set ci_pi='medium' where ci_pj>='75' and ci_pj<'85'
Update ks set ci_pi='general' where ci_pj>='60' and ci_pj<'75'
Update ks set sjnd='high' where sjnd='1'
Update ks set sjnd='medium' where sjnd='2'
Update ks set sjnd='low' where sjnd='3'
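For illustration, the same cleaning can be sketched in Python; the thresholds mirror the (partly reconstructed) update rules above, and the field names used here are hypothetical.

# Illustrative discretization mirroring the data-cleaning updates above
# (assumed bands: >=85 outstanding, 75-84 medium, 60-74 general;
#  difficulty codes 1/2/3 -> high/medium/low).
def discretize_score(raw_mark):
    if raw_mark >= 85:
        return "outstanding"
    if raw_mark >= 75:
        return "medium"
    if raw_mark >= 60:
        return "general"
    return None  # marks below 60 are not covered by the rules shown above

DIFFICULTY_LEVEL = {1: "high", 2: "medium", 3: "low"}

def clean_record(record):
    """record: dict with a raw 'ci_pj' mark and a numeric 'sjnd' code (hypothetical keys)."""
    cleaned = dict(record)
    cleaned["score"] = discretize_score(record["ci_pj"])
    cleaned["paper difficulty"] = DIFFICULTY_LEVEL.get(record["sjnd"])
    return cleaned

print(clean_record({"ci_pj": 92, "sjnd": 2, "course type": "A"}))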
Result of ID3 algorithm

TABLE 2 is the training set of student test score information after data cleaning. We classify the samples into three categories: C_1 = "outstanding", C_2 = "medium", C_3 = "general", with s_1 = 300, s_2 = 1950, s_3 = 880 and s = 3130. According to formula (1) we obtain

I(s_1, s_2, s_3) = I(300, 1950, 880) = -(300/3130) log_2(300/3130) - (1950/3130) log_2(1950/3130) - (880/3130) log_2(880/3130) = 1.256003.

The expected information of every attribute is calculated as follows.
Firstly, calculate "whether re-learning". For yes, s_11 = 210, s_21 = 950, s_31 = 580, and

I(210, 950, 580) = -(210/1740) log_2(210/1740) - (950/1740) log_2(950/1740) - (580/1740) log_2(580/1740) = 1.373186.

For no, s_12 = 90, s_22 = 1000, s_32 = 300, and

I(90, 1000, 300) = -(90/1390) log_2(90/1390) - (1000/1390) log_2(1000/1390) - (300/1390) log_2(300/1390) = 1.074901.

If the samples are partitioned according to "whether re-learning", the expected information is

E("whether re-learning") = (1740/3130) I(210, 950, 580) + (1390/3130) I(90, 1000, 300) = 0.555911 × 1.373186 + 0.444089 × 1.074901 = 1.240721,

so the information gain is

Gain("whether re-learning") = I(s_1, s_2, s_3) - E("whether re-learning") = 1.256003 - 1.240721 = 0.015282.
Secondly, calculate "course type". For course type A, s_11 = 110, s_21 = 200, s_31 = 580, and

I(110, 200, 580) = -(110/890) log_2(110/890) - (200/890) log_2(200/890) - (580/890) log_2(580/890) = 1.259382.

For course type B, s_12 = 100, s_22 = 400, s_32 = 0, and

I(100, 400, 0) = -(100/500) log_2(100/500) - (400/500) log_2(400/500) - 0 = 0.721928.

For course type C, s_13 = 0, s_23 = 550, s_33 = 0, and

I(0, 550, 0) = -0 - (550/550) log_2(550/550) - 0 = 0.

For course type D, s_14 = 90, s_24 = 800, s_34 = 300, and

I(90, 800, 300) = -(90/1190) log_2(90/1190) - (800/1190) log_2(800/1190) - (300/1190) log_2(300/1190) = 1.168009.

E("course type") = (890/3130) × 1.259382 + (500/3130) × 0.721928 + (550/3130) × 0 + (1190/3130) × 1.168009 = 0.91749,

Gain("course type") = 1.256003 - 0.91749 = 0.338513.
Thirdly, calculate "paper difficulty". For high, s_11 = 110, s_21 = 900, s_31 = 280, and

I(110, 900, 280) = -(110/1290) log_2(110/1290) - (900/1290) log_2(900/1290) - (280/1290) log_2(280/1290) = 1.14385.

For medium, s_12 = 190, s_22 = 700, s_32 = 300, and

I(190, 700, 300) = -(190/1190) log_2(190/1190) - (700/1190) log_2(700/1190) - (300/1190) log_2(300/1190) = 1.374086.

For low, s_13 = 0, s_23 = 350, s_33 = 300, and

I(0, 350, 300) = -0 - (350/650) log_2(350/650) - (300/650) log_2(300/650) = 0.995727.

E("paper difficulty") = (1290/3130) × 1.14385 + (1190/3130) × 1.374086 + (650/3130) × 0.995727 = 1.200512,

Gain("paper difficulty") = 1.256003 - 1.200512 = 0.055491.
Fourthly, calculate "whether required course". For yes, s_11 = 210, s_21 = 850, s_31 = 600, and

I(210, 850, 600) = -(210/1660) log_2(210/1660) - (850/1660) log_2(850/1660) - (600/1660) log_2(600/1660) = 1.402468.

For no, s_12 = 90, s_22 = 1100, s_32 = 280, and

I(90, 1100, 280) = -(90/1470) log_2(90/1470) - (1100/1470) log_2(1100/1470) - (280/1470) log_2(280/1470) = 1.015442.

E("whether required") = (1660/3130) × I(210, 850, 600) + (1470/3130) × I(90, 1100, 280) = 1.220681,

Gain("whether required") = 1.256003 - 1.220681 = 0.035322.

Since Gain("course type") = 0.338513 is the largest information gain, "course type" is chosen as the splitting attribute at the root of the tree.
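As an illustrative cross-check (our own code, not from the paper), the snippet below recomputes the expected information E(A) of formula (2) for the four candidate attributes from the subset counts listed above; the attribute with the smallest E(A) has the largest information gain.

import math

def info(counts):
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def expected_info(subset_counts):
    s = sum(sum(c) for c in subset_counts)
    return sum(sum(c) / s * info(c) for c in subset_counts)

# Per-subset class counts (outstanding, medium, general) taken from the text above.
partitions = {
    "whether re-learning": [(210, 950, 580), (90, 1000, 300)],
    "course type":         [(110, 200, 580), (100, 400, 0), (0, 550, 0), (90, 800, 300)],
    "paper difficulty":    [(110, 900, 280), (190, 700, 300), (0, 350, 300)],
    "whether required":    [(210, 850, 600), (90, 1100, 280)],
}

for name, subsets in partitions.items():
    print(f"E({name}) = {expected_info(subsets):.6f}")

root = min(partitions, key=lambda a: expected_info(partitions[a]))
print("largest gain (smallest E):", root)   # course type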
TABLE 2: Training set of student test scores

Course type | Whether re-learning | Paper difficulty | Whether required | Score       | Statistical data
D           | no                  | medium           | no               | outstanding | 90
B           | yes                 | medium           | yes              | outstanding | 100
A           | yes                 | high             | yes              | medium      | 200
D           | no                  | low              | no               | medium      | 350
C           | yes                 | medium           | yes              | general     | 300
A           | yes                 | high             | no               | medium      | 250
B           | no                  | high             | no               | medium      | 300
A           | yes                 | high             | yes              | outstanding | 110
D           | yes                 | medium           | yes              | medium      | 500
D           | no                  | low              | yes              | general     | 300
A           | yes                 | high             | no               | general     | 280
B           | no                  | high             | yes              | medium      | 150
C           | no                  | medium           | no               | medium      | 200

Result of improved algorithm

The original algorithm lacks a termination condition.
Consider the case where only two records remain to be classified in a subtree, as shown in TABLE 3.
TABLE 3: Special case for classification of the subtree

Course type | Whether re-learning | Paper difficulty | Whether required | Score   | Statistical data
A           | no                  | high             | yes              | medium  | 15
A           | no                  | high             | yes              | general | 20

For the records in TABLE 3, all the gains calculated are 0.00, so GainMax = 0.00, which does not satisfy any recursive termination condition of the original algorithm. The tree obtained is therefore not reasonable, so we adopt the improved algorithm; the decision tree produced by the improved algorithm is shown in Figure 2.

Figure 2: Decision tree using improved algorithm
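For illustration only (an assumption about how the improvement might look in code, not the author's implementation), the guard below extends the ID3 sketch given earlier so that the recursion also stops when the maximum achievable information gain is zero, as in the TABLE 3 case, labelling the leaf with the majority class and recording its class statistics.

# Illustrative termination guard for the improved algorithm: stop when the best
# achievable information gain is (numerically) zero, return a leaf labelled with
# the majority class, and record the supporting class statistics on the leaf.
from collections import Counter

LABEL = "score"  # hypothetical class-label key, as in the earlier sketch

def improved_stop(samples, attribute_list, information_gain):
    """Return a leaf node if a termination condition fires, otherwise None."""
    class_counts = Counter(s[LABEL] for s in samples)
    leaf = {"leaf": class_counts.most_common(1)[0][0], "statistics": dict(class_counts)}
    if len(class_counts) == 1 or not attribute_list:
        return leaf                                   # original termination conditions
    gain_max = max(information_gain(samples, a) for a in attribute_list)
    if gain_max <= 1e-12:                             # added condition: GainMax = 0
        return leaf
    return None                                       # continue the normal recursion

In the TABLE 3 situation every candidate split leaves all records in a single branch, so every gain is zero and the guard produces a leaf instead of recursing further.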
CONCLUSIONS

In this paper we study the construction of the decision tree and the attribute selection measure. Because the original algorithm lacks a termination condition, we propose an improved algorithm. Taking the course scores of an institute of English language as an example, we find the latent factors that influence the grades.
REFERENCES

[1] Xuelei Xu, Chunwei Lou; "Applying Decision Tree Algorithms in English Vocabulary Test Item Selection", IJACT: International Journal of Advancements in Computing Technology, 4(4), 165-173 (2012).
[2] Huawei Zhang; "Lazy Decision Tree Method for Distributed Privacy Preserving Data Mining", IJACT: International Journal of Advancements in Computing Technology, 4(14), 458-465 (2012).
[3] Xin-hua Zhu, Jin-ling Zhang, Jiang-tao Lu; "An Education Decision Support System Based on Data Mining Technology", JDCTA: International Journal of Digital Content Technology and its Applications, 6(23), 354-363 (2012).
[4] Zhen Liu, Xian Feng Yang; "An application model of fuzzy clustering analysis and decision tree algorithms in building web mining", JDCTA: International Journal of Digital Content Technology and its Applications, 6(23), 492-500 (2012).
[5] Guang-xian Ji; "The research of decision tree learning algorithm in technology of data mining classification", JCIT: Journal of Convergence Information Technology, 7(10), 216-223 (2012).
[6] Fuxian Huang; "Research of an Algorithm for Generating Cost-Sensitive Decision Tree Based on Attribute Significance", JDCTA: International Journal of Digital Content Technology and its Applications, 6(12), 308-316 (2012).
[7] M. Sudheep Elayidom, Sumam Mary Idikkula, Joseph Alexander; "Design and Performance analysis of Data mining techniques Based on Decision trees and Naive Bayes classifier For", JCIT: Journal of Convergence Information Technology, 6(5), 89-98 (2011).
[8] Marjan Bahrololum, Elham Salahi, Mahmoud Khaleghi; "An Improved Intrusion Detection Technique based on two Strategies Using Decision Tree and Neural Network", JCIT: Journal of Convergence Information Technology, 4(4), 96-101 (2009).
[9] Bor-tyng Wang, Tian-Wei Sheu, Jung-Chin Liang, Jian-Wei Tzeng, Nagai Masatake; "The Study of Soft Computing on the Field of English Education: Applying Grey S-P Chart in English Writing Assessment", JDCTA: International Journal of Digital Content Technology and its Applications, 5(9), 379-388 (2011).
[10] Mohamad Farhan Mohamad Mohsin, Mohd Helmy Abd Wahab, Mohd Fairuz Zaiyadi, Cik Fazilah Hibadullah; "An Investigation into Influence Factor of Student Programming Grade Using Association Rule Mining", AISS: Advances in Information Sciences and Service Sciences, 2(2), 19-27 (2010).
[11] Hao Xin; "Assessment and Analysis of Hierarchical and Progressive Bilingual English Education Based on Neuro-Fuzzy approach", AISS: Advances in Information Sciences and Service Sciences, 5(1), 269-276 (2013).
[12] Hong-chao Chen, Jin-ling Zhang, Ya-qiong Deng; "Application of Mixed-Weighted-Association-Rules-Based Data Mining Technology in College Examination Grades Analysis", JDCTA: International Journal of Digital Content Technology and its Applications, 6(10), 336-344 (2012).
[13] Yuan Wang, Lan Zheng; "Endocrine Hormones Association Rules Mining Based on Improved Apriori Algorithm", JCIT: Journal of Convergence Information Technology, 7(7), 72-82 (2012).
[14] Tian Bai, Jinchao Ji, Zhe Wang, Chunguang Zhou; "Application of a Global Categorical Data Clustering Method in Medical Data Analysis", AISS: Advances in Information Sciences and Service Sciences, 4(7), 182-190 (2012).
[15] Hong Yan Mei, Yan Wang, Jun Zhou; "Decision Rules Extraction Based on Necessary and Sufficient Strength and Classification Algorithm", AISS: Advances in Information Sciences and Service Sciences, 4(14), 441-449 (2012).
[16] Liu Yong; "The Building of Data Mining Systems based on Transaction Data Mining Language using Java", JDCTA: International Journal of Digital Content Technology and its Applications, 6(14), 298-305 (2012).