Technology sharing

57. Classification based on a probabilistic neural network (PNN) (MATLAB)

2024-07-12


1. Introduction to classification with probabilistic neural networks (PNN)

A probabilistic neural network (PNN) is a neural-network model grounded in probability theory and is mainly used to solve classification problems. The PNN was first proposed by Donald Specht in 1990 and is a very effective classification algorithm.

The working principle of a PNN can be summarized in the following steps (a simplified code sketch follows the list):

  1. Input layer: each input sample is fed into the model.
  2. Pattern layer: every input is matched against the stored pattern vectors, and a similarity score between the input and each known category is computed.
  3. Competition/output layer: the similarity scores of all categories are compared, and the category with the highest score is returned as the final classification result.
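
The list above maps directly onto a small computation. The following MATLAB sketch reproduces the three steps by hand for one input sample; the toy data, the Gaussian kernel, and the variable names are illustrative assumptions, and the exact kernel scaling used inside newpnn differs slightly.

    % Simplified, hand-written version of the PNN decision rule (illustrative only)
    X      = [1 2; 2 2; 1 1]';          % stored pattern vectors (one per column)
    Tc     = [1 2 3];                   % class label of each stored pattern
    x      = [2; 1.5];                  % input sample to classify
    spread = 1;                         % width of the Gaussian kernel

    % Pattern layer: similarity of x to every stored pattern
    d = sum((X - x).^2, 1);             % squared Euclidean distances
    s = exp(-d / (2*spread^2));         % Gaussian similarity scores

    % Competition/output layer: sum the scores per class and pick the maximum
    scores = accumarray(Tc(:), s(:));
    [~, predictedClass] = max(scores)   % index of the winning class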

A PNN has the following characteristics:

  1. Efficiency: a PNN trains quickly and achieves high classification accuracy in practical applications.
  2. Robustness: a PNN is robust to noise and outliers and can handle complex classification problems effectively.
  3. Interpretability: the results of a PNN can be interpreted intuitively, which helps users understand the basis of the model's classification.

In general, the PNN is an efficient classification algorithm suitable for classification problems in many fields, such as image recognition and text classification.

2. Problem description and key functions for classification with a probabilistic neural network (PNN)

1) Description

There are three two-element input vectors X and their associated classes Tc.
Create a probabilistic neural network that classifies these vectors correctly.

2) Key functions

The newpnn() function: design a probabilistic neural network

A probabilistic neural network (PNN) is a kind of radial basis network that is suitable for classification problems.
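
As a quick check of that claim, you can inspect the network object that newpnn builds; a minimal sketch, assuming the Neural Network / Deep Learning Toolbox is installed:

    X   = [1 2; 2 2; 1 1]';
    T   = ind2vec([1 2 3]);
    net = newpnn(X, T, 1);
    net.layers{1}.transferFcn   % 'radbas'  (radial basis layer)
    net.layers{2}.transferFcn   % 'compet'  (competitive output layer)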

Syntax

net = newpnn(P,T,spread)  % takes two or three arguments and returns a new probabilistic neural network

Parameters

P: R-by-Q matrix of Q input vectors

T: S-by-Q matrix of Q target class vectors

spread: spread of the radial basis functions (default = 0.1)

If spread is near zero, the network acts as a nearest-neighbor classifier. As spread becomes larger, the designed network takes several nearby design vectors into account.
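
The effect of spread can be seen on a small toy problem; a sketch under illustrative assumptions (the data, the query point x, and the two spread values are not from the article):

    % Class 1 has one pattern near x; class 2 has three patterns slightly farther away.
    P  = [0.5  0.8  0   -0.8;
          0    0    0.8  0  ];
    Tc = [1 2 2 2];
    T  = ind2vec(Tc);
    x  = [0; 0];

    netSmall = newpnn(P, T, 0.05);  % behaves like a nearest-neighbour classifier
    netLarge = newpnn(P, T, 2);     % pools evidence from several nearby patterns

    vec2ind(netSmall(x))   % 1: decided by the single closest pattern
    vec2ind(netLarge(x))   % 2: three moderately close class-2 patterns outweigh it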

The sim() function: simulate a neural network

Syntax

[Y,Xf,Af] = sim(net,X,Xi,Ai,T) 
Parameters

net: the network to simulate

X: network inputs

Xi: initial input delay conditions (default = 0)

Ai: initial layer delay conditions (default = 0)

T: network targets (default = 0)
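
For a static network such as a PNN there are no delay states, so only the first two arguments of sim() matter; the later examples in this article use the equivalent shorthand net(X). A small sketch of the equivalence, assuming net and X from the data-set section below:

    Y1 = sim(net, X);   % classic sim() call
    Y2 = net(X);        % shorthand form used in the code below
    isequal(Y1, Y2)     % true: both return the same class-indicator output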

3. Data set and display

code

    X = [1 2; 2 2; 1 1]';      % three two-element input vectors (one per column)
    Tc = [1 2 3];              % their associated class indices
    figure(1)
    plot(X(1,:),X(2,:),'.','markersize',30)
    for i = 1:3, text(X(1,i)+0.1,X(2,i),sprintf('class %g',Tc(i))), end
    axis([0 3 0 3])
    title('Three two-element vectors and their classes')
    xlabel('X(1,:)')
    ylabel('X(2,:)')

Result

[Figure 1: the three two-element vectors plotted and labeled with their classes]

4. Test the network on the design input vectors

1) Description

Convert the target class indices Tc to vectors T.
Design a probabilistic neural network with NEWPNN.
Use a SPREAD value of 1, because that is a typical distance between the input vectors.

2) Test the network

code

    T = ind2vec(Tc);           % convert class indices to target vectors
    spread = 1;
    net = newpnn(X,T,spread);
    % Test the network
    % Test the network on the design input vectors: simulate the network and
    % convert its vector output back to class indices.
    Y = net(X);
    Yc = vec2ind(Y);
    figure(2)
    plot(X(1,:),X(2,:),'.','markersize',30)
    axis([0 3 0 3])
    for i = 1:3, text(X(1,i)+0.1,X(2,i),sprintf('class %g',Yc(i))), end
    title('Network test')
    xlabel('X(1,:)')
    ylabel('X(2,:)')

Result

[Figure 2: the training vectors labeled with the classes predicted by the network]

3) Test the network on new data

code

    x = [2; 1.5];              % a new input vector
    y = net(x);
    ac = vec2ind(y);           % predicted class index
    hold on
    figure(3)
    plot(x(1),x(2),'.','markersize',30,'color',[1 0 0])
    text(x(1)+0.1,x(2),sprintf('class %g',ac))
    hold off
    title('Classification of new data')
    xlabel('X(1,:) and x(1)')
    ylabel('X(2,:) and x(2)')

Result

[Figure 3: the new data point plotted in red and labeled with its predicted class]

5. The probabilistic neural network divides the input space into three classes

Description

Divide the input plane into the three classes by evaluating the network on a grid of points and shading each region.

code

    x1 = 0:.05:3;
    x2 = x1;
    [X1,X2] = meshgrid(x1,x2);     % grid covering the input plane
    xx = [X1(:) X2(:)]';
    yy = net(xx);                  % classify every grid point
    yy = full(yy);
    m = mesh(X1,X2,reshape(yy(1,:),length(x1),length(x2)));
    m.FaceColor = [0 0.5 1];
    m.LineStyle = 'none';
    hold on
    m = mesh(X1,X2,reshape(yy(2,:),length(x1),length(x2)));
    m.FaceColor = [0 1.0 0.5];
    m.LineStyle = 'none';
    m = mesh(X1,X2,reshape(yy(3,:),length(x1),length(x2)));
    m.FaceColor = [0.5 0 1];
    m.LineStyle = 'none';
    plot3(X(1,:),X(2,:),[1 1 1]+0.1,'.','markersize',30)
    plot3(x(1),x(2),1.1,'.','markersize',30,'color',[1 0 0])
    hold off
    view(2)
    title('Three-class classification')
    xlabel('X(1,:) and x(1)')
    ylabel('X(2,:) and x(2)')

Result

[Figure 4: the input plane shaded by class, with the training vectors and the new point overlaid]

6. Summary

A probabilistic neural network (PNN) is an artificial neural network used for classification. It is based on Bayes' theorem and Gaussian mixture (kernel density) models and can be applied to various kinds of data, including continuous and discrete data. When dealing with classification problems, a PNN is more flexible than traditional neural networks and offers good accuracy and generalization ability.

The main working principle of a PNN is to compute the similarity between the input data and every sample in the training set and to classify the input according to those similarities. A PNN consists of four layers: an input layer, a pattern layer, a summation layer, and an output layer. Input data first pass from the input layer to the pattern layer, the per-class similarities are then accumulated in the summation layer, and finally the output layer assigns the class with the highest score.

In MATLAB, you can use the relevant toolbox functions or write your own code to implement PNN classification. First prepare the training and test data, then build the PNN model from the training data. Once training is complete, the test data can be used to evaluate the classification performance of the PNN and to make predictions.
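
As a concrete illustration of that workflow, the sketch below builds a PNN on randomly generated training data and measures its accuracy on a held-out test set; the data, the spread value of 1, and the two-class split are illustrative assumptions rather than part of the article's example.

    rng(0);                                             % reproducible random data
    nTrain = 20; nTest = 10;                            % samples per class
    Xtrain = [randn(2,nTrain)+[1;1], randn(2,nTrain)+[4;4]];
    Ttrain = [ones(1,nTrain), 2*ones(1,nTrain)];
    Xtest  = [randn(2,nTest)+[1;1], randn(2,nTest)+[4;4]];
    Ttest  = [ones(1,nTest), 2*ones(1,nTest)];

    net   = newpnn(Xtrain, ind2vec(Ttrain), 1);         % "training" = storing the patterns
    Ypred = vec2ind(net(Xtest));                        % classify the held-out test set
    accuracy = mean(Ypred == Ttest)                     % fraction classified correctly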

In summary, the PNN is a powerful classification method suited to a wide range of classification problems. In practical applications, appropriate features and model parameters can be selected for the specific problem to achieve better classification performance. MATLAB provides rich toolbox and function support, which makes it easy to implement and apply a PNN.

7. Source code

code

    %% Classification based on a probabilistic neural network (PNN) (MATLAB)
    % There are three two-element input vectors X and their associated classes Tc.
    % Create a probabilistic neural network that classifies these vectors correctly.
    % Key functions: NEWPNN and SIM
    %% Data set and display
    X = [1 2; 2 2; 1 1]';
    Tc = [1 2 3];
    figure(1)
    plot(X(1,:),X(2,:),'.','markersize',30)
    for i = 1:3, text(X(1,i)+0.1,X(2,i),sprintf('class %g',Tc(i))), end
    axis([0 3 0 3])
    title('Three two-element vectors and their classes')
    xlabel('X(1,:)')
    ylabel('X(2,:)')
    %% Test the network on the design input vectors
    % Convert the target class indices Tc to vectors T
    % Design a probabilistic neural network with NEWPNN
    % Use a SPREAD value of 1, because that is a typical distance between the input vectors.
    T = ind2vec(Tc);
    spread = 1;
    net = newpnn(X,T,spread);
    % Test the network
    % Test the network on the input vectors: simulate the network and convert its
    % vector output back to class indices.
    Y = net(X);
    Yc = vec2ind(Y);
    figure(2)
    plot(X(1,:),X(2,:),'.','markersize',30)
    axis([0 3 0 3])
    for i = 1:3, text(X(1,i)+0.1,X(2,i),sprintf('class %g',Yc(i))), end
    title('Network test')
    xlabel('X(1,:)')
    ylabel('X(2,:)')
    % Test the network on new data
    x = [2; 1.5];
    y = net(x);
    ac = vec2ind(y);
    hold on
    figure(3)
    plot(x(1),x(2),'.','markersize',30,'color',[1 0 0])
    text(x(1)+0.1,x(2),sprintf('class %g',ac))
    hold off
    title('Classification of new data')
    xlabel('X(1,:) and x(1)')
    ylabel('X(2,:) and x(2)')
    %% The probabilistic neural network divides the input space into three classes.
    x1 = 0:.05:3;
    x2 = x1;
    [X1,X2] = meshgrid(x1,x2);
    xx = [X1(:) X2(:)]';
    yy = net(xx);
    yy = full(yy);
    m = mesh(X1,X2,reshape(yy(1,:),length(x1),length(x2)));
    m.FaceColor = [0 0.5 1];
    m.LineStyle = 'none';
    hold on
    m = mesh(X1,X2,reshape(yy(2,:),length(x1),length(x2)));
    m.FaceColor = [0 1.0 0.5];
    m.LineStyle = 'none';
    m = mesh(X1,X2,reshape(yy(3,:),length(x1),length(x2)));
    m.FaceColor = [0.5 0 1];
    m.LineStyle = 'none';
    plot3(X(1,:),X(2,:),[1 1 1]+0.1,'.','markersize',30)
    plot3(x(1),x(2),1.1,'.','markersize',30,'color',[1 0 0])
    hold off
    view(2)
    title('Three-class classification')
    xlabel('X(1,:) and x(1)')
    ylabel('X(2,:) and x(2)')