decision tree (4)

Decision Tree - Measuring Feature Importance

2023.09.04 - [AI, Machine Learning] - XGBoost Feature Importance
https://medium.com/the-artificial-impostor/feature-importance-measures-for-tree-models-part-i-47f187c1a2c3 Feature Importance Measures for Tree Models — Part I: An Incomplete Review
https://mljar.com/blog/feature-importance-in-random-forest/ Random Forest Feature Importance Computed in 3 Ways with Python
The feature importance (variable importanc..
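As a concrete illustration (a minimal sketch of my own, not code from the linked posts), two common ways to compute feature importance for a tree ensemble with scikit-learn: the impurity-based importances stored on the fitted model, and permutation importance measured on held-out data. The dataset and hyperparameters here are arbitrary choices for the example.

# Minimal sketch: two common feature-importance measures for tree models,
# using scikit-learn on a toy dataset (dataset/parameters are placeholders).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# 1) Impurity-based importance: total impurity decrease contributed by each feature.
impurity_imp = dict(zip(X.columns, model.feature_importances_))

# 2) Permutation importance: drop in test score when a feature's values are shuffled.
perm = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
perm_imp = dict(zip(X.columns, perm.importances_mean))

for name in sorted(perm_imp, key=perm_imp.get, reverse=True)[:5]:
    print(f"{name}: impurity={impurity_imp[name]:.3f}, permutation={perm_imp[name]:.3f}")

Impurity-based importances are cheap to read off the fitted trees but can favor high-cardinality features, which is why permutation importance is often reported alongside them.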

AI, Machine Learning 2023.09.01

Decision Tree - The C4.5 Algorithm

https://levelup.gitconnected.com/c4-5-decision-tree-explained-from-bottom-up-67468c1619a7 C4.5 Decision Tree, Explained from the Bottom Up - C4.5 is a complicated algorithm to understand and requires a lot of background knowledge. This blog has tried to collate…
https://tyami.github.io/machine%20learning/decision-tree-3-c4_5/ Decision Tree: The C4.5 Algorithm..
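To give a feel for what C4.5 adds over plain information gain, here is a minimal sketch (the toy data and function names are my own, not taken from the linked posts) of its gain ratio criterion: information gain normalized by the split information, i.e. the entropy of the partition sizes themselves, which penalizes splits that scatter the data over many small branches.

# Minimal sketch of C4.5's split criterion (gain ratio) on toy categorical data.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    n = len(labels)
    # Partition the labels by the feature's value.
    groups = {}
    for v, y in zip(feature_values, labels):
        groups.setdefault(v, []).append(y)
    # Information gain: entropy before the split minus weighted entropy after.
    gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups.values())
    # Split information: entropy of the partition sizes themselves.
    split_info = -sum((len(g) / n) * log2(len(g) / n) for g in groups.values())
    return gain / split_info if split_info > 0 else 0.0

# Hypothetical toy data: weather outlook vs. whether a game is played.
outlook = ["sunny", "sunny", "overcast", "rain", "rain", "rain", "overcast", "sunny"]
play    = ["no",    "no",    "yes",      "yes",  "yes",  "no",   "yes",      "yes"]
print(gain_ratio(outlook, play))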

AI, Machine Learning 2023.08.24

Decision Tree

https://www.ibm.com/topics/decision-trees What is a Decision Tree | IBM - Learn the pros and cons of using decision trees for data mining and knowledge discovery tasks
Tree node splitting - The most common criteria for choosing the best feature when splitting a decision tree node are information gain and Gini impurity. Information Gain - Information gain usually means the difference in entropy before and after a node split, but when Gini impurity or mean squared error is used in place of entropy..
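To make the two split criteria concrete, a small self-contained sketch (the toy labels are made up, not from the post) that computes the gain of one candidate binary split using either entropy or Gini impurity as the impurity measure:

# Minimal sketch: entropy, Gini impurity, and the gain of one candidate split.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gain(parent, left, right, impurity=entropy):
    # Impurity of the parent node minus the size-weighted impurity of the children.
    n = len(parent)
    return impurity(parent) - (len(left) / n) * impurity(left) - (len(right) / n) * impurity(right)

parent = [0, 0, 0, 0, 1, 1, 1, 1]
left, right = [0, 0, 0, 1], [0, 1, 1, 1]   # one candidate split of the parent

print("entropy gain:", split_gain(parent, left, right, impurity=entropy))
print("gini gain:   ", split_gain(parent, left, right, impurity=gini))

Swapping gini (or a mean-squared-error impurity for regression) in for entropy is exactly the substitution the paragraph above describes.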

AI, Machine Learning 2023.08.24