#author("2022-07-02T01:02:57+09:00","","")
#author("2022-07-02T01:03:20+09:00","","")
[[Pythonライブラリ]]

Table of contents
#contents

*Supervised learning [#l60ab254]

**Support vector machine (SVM) [#za5289a5]
-Placeholder URLs
https://qiita.com/kazuki_hayakawa/items/18b7017da9a6f73eba77
https://qiita.com/Hirochon/items/12379d7ca6141f1fb6fa
https://kenyu-life.com/2019/02/11/support_vector_machine/
https://qiita.com/renesisu727/items/964005bd29aa680ad82d
https://techacademy.jp/magazine/34353
https://qiita.com/hiro88hyo/items/d17cb02b7356f07d16fb
https://watlab-blog.com/2019/12/22/svm/
https://watlab-blog.com/2019/12/29/svr/
https://aizine.ai/svm-0902/
https://aizine.ai/svm-0703/
https://www.pc-koubou.jp/magazine/22439
https://aizine.ai/python-svm0925/
https://www.datacamp.com/tutorial/svm-classification-scikit-learn-python
https://data-flair.training/blogs/svm-support-vector-machine-tutorial/
https://scikit-learn.org/stable/modules/classes.html#module-sklearn.svm
https://scikit-learn.org/stable/modules/svm.html#svm

**Decision tree analysis [#ycf0cfb2]
-Placeholder URLs
https://blog.kikagaku.co.jp/decision-tree-visualization
https://cacoo.com/ja/blog/what-is-decision-tree/
https://qiita.com/3000manJPY/items/ef7495960f472ec14377

**Random forest [#gba9571e]
-Placeholder URLs
https://qiita.com/renesisu727/items/09d9e88ab4e14ab034ed
https://qiita.com/mshinoda88/items/8bfe0b540b35437296bd
https://mathwords.net/randomforest
https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html

**Naive Bayes [#hec1d064]
-Placeholder URLs
https://qiita.com/ynakayama/items/ca3f5e9d762bbd50ad1f
https://di-acc2.com/programming/python/8709/

**Simple and multiple linear regression [#ae929df8]
-Placeholder URLs
https://aiacademy.jp/media/?p=236
https://www.albert2005.co.jp/knowledge/statistics_analysis/multivariate_analysis/single_regression
https://www.albert2005.co.jp/knowledge/statistics_analysis/multivariate_analysis/multiple_regression

**Logistic regression [#w22d37a9]
-Placeholder URLs
https://aizine.ai/logistic-regression0810/
https://navaclass.com/logistic-regression/
https://udemy.benesse.co.jp/data-science/data-analysis/logistic-regression-analysis.html

**k-nearest neighbors (kNN) [#ef712358]
-Placeholder URLs
https://qiita.com/fujin/items/128ed7188f7e7df74f2c
https://qiita.com/oirom/items/22ccb7c0139dce925f43
http://labs.eecs.tottori-u.ac.jp/sd/Member/oyamada/OpenCV/html/py_tutorials/py_ml/py_knn/py_knn_understanding/py_knn_understanding.html
https://zenn.dev/kumamoto/articles/bc6230323bc0ad
https://scikit-learn.org/stable/modules/neighbors.html

**Multilayer perceptron (MLP) [#kbe1661d]
-Placeholder URLs
https://qiita.com/maskot1977/items/d0253e1eab1ff1315dff
https://scikit-learn.org/stable/modules/neural_networks_supervised.html
https://aizine.ai/glossary-mlp/
https://rightcode.co.jp/blog/information-technology/multilayer-perceptron-implementation

*Unsupervised learning [#vfd9eca8]

**Principal component analysis (PCA) [#s6442221]
-Placeholder URLs
http://www.math.keio.ac.jp/~kei/GDS/2nd/pca.html
https://tech-clips.com/principal-component-analysis-using-python
https://recruit.cct-inc.co.jp/tecblog/machine-learning/pca-kaisetsu/
https://qiita.com/maskot1977/items/082557fcda78c4cdb41f
https://santakalog.com/2021/02/13/python-pca/
https://www.samoariblog.com/2021/04/python-pca.html
https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html?highlight=pca#sklearn.decomposition.PCA

**K-means [#ydabe5c1]
-Placeholder URLs
https://ebi-works.com/k-means/
https://techacademy.jp/magazine/28780
https://laid-back-scientist.com/k-means
https://di-acc2.com/programming/python/4235/
https://qiita.com/g-k/items/0d5d22a12a4507ecbf11
https://www.albert2005.co.jp/knowledge/data_mining/cluster/non-hierarchical_clustering
https://watlab-blog.com/2020/01/19/k-means/

**Hierarchical clustering [#x812bf47]
-Placeholder URLs
https://scikit-learn.org/stable/modules/clustering.html#hierarchical-clustering
https://data-analysis-stats.jp/%E6%A9%9F%E6%A2%B0%E5%AD%A6%E7%BF%92/scikit-learn%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F%E9%9A%8E%E5%B1%A4%E7%9A%84%E3%82%AF%E3%83%A9%E3%82%B9%E3%82%BF%E3%83%AA%E3%83%B3%E3%82%B0-hierarchical-clustering%E3%81%AE%E8%A7%A3%E8%AA%AC/
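
Most of the links above ultimately point at scikit-learn, so a single runnable sketch may be a useful starting point alongside them. The following is a minimal illustration, not a recommended pipeline: it assumes scikit-learn is installed, uses the bundled iris dataset purely as sample data, and leaves every estimator at (or near) its default hyperparameters.

```python
# Minimal scikit-learn sketch covering the methods listed on this page.
# Assumptions: scikit-learn installed; iris is used only as toy data;
# hyperparameters are defaults except max_iter bumps for convergence.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Supervised: each classifier shares the same fit/score interface.
models = {
    "SVM": SVC(kernel="rbf"),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Logistic regression": LogisticRegression(max_iter=200),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy {model.score(X_test, y_test):.3f}")

# Unsupervised: project to 2 components with PCA, then cluster with K-means.
X2 = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X2)
print("cluster sizes:", sorted((labels == k).sum() for k in range(3)))
```

The shared `fit`/`predict`/`score` interface is the point of the sketch: any of the per-method tutorials linked above can be followed by swapping a single estimator into this skeleton.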