[Help]
What do the parameters of this MATLAB ID3 implementation mean? Urgent!
Below is an ID3 implementation I found online. I can't work out the parameter meanings from the comments alone, and I'd like to get the algorithm running as soon as possible. Could someone kindly explain what `params` and `region` actually mean? A runnable example would be even better. Thanks!

PS: the forum turned some character sequences in the pasted code into smiley faces; they should be restored as `:` and `)` (done below).

function D = ID3(train_features, train_targets, params, region)
% Classify using Quinlan's ID3 algorithm
% Inputs:
%   features - Train features
%   targets  - Train targets
%   params   - [Number of bins for the data,
%               Percentage of incorrectly assigned samples at a node]
%   region   - Decision region vector: [-x x -y y number_of_points]
%
% Outputs
%   D        - Decision surface

[Ni, M] = size(train_features);

%Get parameters
[Nbins, inc_node] = process_params(params);
inc_node = inc_node*M/100;

%For the decision region
N      = region(5);
mx     = ones(N,1) * linspace(region(1), region(2), N);
my     = linspace(region(3), region(4), N)' * ones(1,N);
flatxy = [mx(:), my(:)]';

%Preprocessing
[f, t, UW, m]  = PCA(train_features, train_targets, Ni, region);
train_features = UW * (train_features - m*ones(1,M));
flatxy         = UW * (flatxy - m*ones(1,N^2));

%First, bin the data and the decision region data
[H, binned_features] = high_histogram(train_features, Nbins, region);
[H, binned_xy]       = high_histogram(flatxy, Nbins, region);

%Build the tree recursively
disp('Building tree')
tree = make_tree(binned_features, train_targets, inc_node, Nbins);

%Make the decision region according to the tree
disp('Building decision surface using the tree')
targets = use_tree(binned_xy, 1:N^2, tree, Nbins, unique(train_targets));

D = reshape(targets, N, N);
%END

function targets = use_tree(features, indices, tree, Nbins, Uc)
%Classify recursively using a tree
targets = zeros(1, size(features,2));

if (size(features,1) == 1),
    %Only one dimension left, so work on it
    for i = 1:Nbins,
        in = indices(find(features(indices) == i));
        if ~isempty(in),
            if isfinite(tree.child(i)),
                targets(in) = tree.child(i);
            else
                %No data was found in the training set for this bin,
                %so choose a class randomly
                n           = 1 + floor(rand(1)*length(Uc));
                targets(in) = Uc(n);
            end
        end
    end
    return
end

%This is not the last level of the tree, so:
%First, find the dimension we are to work on
dim  = tree.split_dim;
dims = find(~ismember(1:size(features,1), dim));

%And classify according to it
for i = 1:Nbins,
    in      = indices(find(features(dim, indices) == i));
    targets = targets + use_tree(features(dims, :), in, tree.child(i), Nbins, Uc);
end
%END use_tree

function tree = make_tree(features, targets, inc_node, Nbins)
%Build a tree recursively
[Ni, L] = size(features);
Uc      = unique(targets);

%When to stop: if the dimension is one or the number of examples is small
if ((Ni == 1) | (inc_node > L)),
    %Compute the children non-recursively
    for i = 1:Nbins,
        tree.split_dim = 0;
        indices        = find(features == i);
        if ~isempty(indices),
            if (length(unique(targets(indices))) == 1),
                tree.child(i) = targets(indices(1));
            else
                H             = hist(targets(indices), Uc);
                [m, T]        = max(H);
                tree.child(i) = Uc(T);
            end
        else
            tree.child(i) = inf;
        end
    end
    return
end

%Compute the node's entropy I
%(note: the sum must run over the classes Uc, not over the Ni dimensions)
for i = 1:length(Uc),
    Pnode(i) = length(find(targets == Uc(i))) / L;
end
Inode = -sum(Pnode.*log(Pnode)/log(2));

%For each dimension, compute the gain ratio impurity
delta_Ib = zeros(1, Ni);
P        = zeros(length(Uc), Nbins);
for i = 1:Ni,
    for j = 1:length(Uc),
        for k = 1:Nbins,
            indices = find((targets == Uc(j)) & (features(i,:) == k));
            P(j,k)  = length(indices);
        end
    end
    Pk   = sum(P);
    P    = P/L;
    Pk   = Pk/sum(Pk);
    info = sum(-P.*log(eps+P)/log(2));
    delta_Ib(i) = (Inode - sum(Pk.*info)) / -sum(Pk.*log(eps+Pk)/log(2));
end

%Find the dimension maximizing delta_Ib
[m, dim] = max(delta_Ib);

%Split along the 'dim' dimension
tree.split_dim = dim;
dims           = find(~ismember(1:Ni, dim));
for i = 1:Nbins,
    indices       = find(features(dim, :) == i);
    tree.child(i) = make_tree(features(dims, indices), targets(indices), inc_node, Nbins);
end
%END make_tree
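Reading the header comments: `params = [Nbins, inc_node]` sets the number of histogram bins per feature and the stopping threshold (as a percentage of samples), and `region = [x_min x_max y_min y_max N]` defines the grid on which the decision surface `D` is evaluated. The core of `make_tree` is the gain-ratio loop over `delta_Ib`. Below is a minimal NumPy sketch of just that computation (not part of the original toolbox; the function name `gain_ratio` is my own), faithful to the posted code's convention of using the joint probability `P(j,k)` inside the per-bin entropy:

```python
import numpy as np

def gain_ratio(binned_features, targets, nbins):
    """Gain ratio per feature row, mirroring the delta_Ib loop in ID3.m.

    binned_features: (n_features, n_samples) integer bins in 1..nbins
    targets:         (n_samples,) class labels
    """
    classes = np.unique(targets)
    n = targets.size
    eps = np.finfo(float).eps

    # Node entropy: Inode = -sum_c p_c * log2(p_c)
    p_node = np.array([(targets == c).mean() for c in classes])
    i_node = -np.sum(p_node * np.log2(p_node + eps))

    ratios = np.zeros(binned_features.shape[0])
    for i, row in enumerate(binned_features):
        # P[j, k] = count of samples with class j falling in bin k
        P = np.array([[np.sum((targets == c) & (row == k))
                       for k in range(1, nbins + 1)] for c in classes], float)
        pk = P.sum(axis=0)          # samples per bin (Pk in the MATLAB code)
        P /= n                      # joint probability, as in ID3.m
        pk /= pk.sum()
        info = np.sum(-P * np.log2(P + eps), axis=0)   # per-bin impurity
        split_info = -np.sum(pk * np.log2(pk + eps))
        ratios[i] = (i_node - np.sum(pk * info)) / split_info
    return ratios
```

The split dimension is then `np.argmax(ratios)`, matching `[m, dim] = max(delta_Ib)` in the MATLAB code: a feature whose bins align with the class labels scores high, while an uninformative feature scores near zero.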