Post list: Study notes (11)
Miscellany Blog
https://www.lgaimers.ai/ : notes compiled from the LG AIMers education program ("LG AI: a more valuable future that LG and young people build together"). Topics: Determinant & Eigenvalue; Cholesky Decomposition; Diagonalization and Singular Value Decomposition. 1. Matrix Decomposition (for a 2x2 matrix): import numpy as np; def inverse_Matrix(A11, A12, A21, A22): A = np.array([[A11, A12], [A21, A22]]); if A[0][0]*A[1][1] - A[0][1]*A[1][0] !..
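The inverse-matrix snippet in the preview is cut off at the determinant test. A plausible completion, using the 2x2 adjugate formula (the behavior on a singular matrix is my assumption, not taken from the post):

```python
import numpy as np

def inverse_matrix(a11, a12, a21, a22):
    """Invert a 2x2 matrix via the adjugate formula, guarding on the determinant."""
    A = np.array([[a11, a12], [a21, a22]], dtype=float)
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    if det != 0:
        # adjugate (swap diagonal, negate off-diagonal) divided by determinant
        return np.array([[A[1][1], -A[0][1]],
                         [-A[1][0], A[0][0]]]) / det
    raise ValueError("matrix is singular (determinant is 0)")

# Example: [[4, 7], [2, 6]] has determinant 4*6 - 7*2 = 10
inv = inverse_matrix(4, 7, 2, 6)
```

Multiplying the result back against the original matrix should give the identity, which is a quick sanity check for this kind of hand-rolled inverse.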
https://arxiv.org/abs/2012.06678 TabTransformer: Tabular Data Modeling Using Contextual Embeddings (arxiv.org). Abstract: We propose TabTransformer, a novel deep tabular data modeling architecture for supervised and semi-supervised learning. The TabTransformer is built upon self-attention based Transformers. The Transformer layers transform the embeddings of categorical features into contextual embeddings ..
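As a rough illustration of the idea in the abstract, Transformer layers turning per-column categorical embeddings into contextual embeddings, here is a minimal numpy sketch. The column count, vocabulary sizes, embedding dimension, and the identity query/key/value projections are all illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 3 categorical columns, each with its own embedding table, d = 8
d = 8
vocab_sizes = [5, 3, 7]
tables = [rng.normal(size=(v, d)) for v in vocab_sizes]

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contextual_embeddings(row):
    """One self-attention pass over the column embeddings of a single row."""
    # Look up each column's embedding -> (n_cols, d)
    E = np.stack([tables[i][cat] for i, cat in enumerate(row)])
    # Single-head attention; Q = K = V = E for brevity
    scores = softmax(E @ E.T / np.sqrt(d))   # (n_cols, n_cols) attention weights
    return scores @ E                        # contextualized embeddings, (n_cols, d)

ctx = contextual_embeddings([2, 0, 4])
print(ctx.shape)  # (3, 8)
```

Each output row now mixes information from the other columns, which is the sense in which the embeddings become "contextual".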
https://arxiv.org/abs/1908.07442 TabNet: Attentive Interpretable Tabular Learning (arxiv.org). Abstract: We propose TabNet, a novel high-performance and interpretable canonical deep tabular data learning architecture. TabNet uses sequential attention to choose which features to reason from at each decision step, enabling interpretability and more efficient learning ..
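The abstract's key mechanism, sequential attention masks that select features step by step, can be sketched as below. Softmax stands in for the paper's sparsemax, and the weight matrices and the prior-update constant gamma are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sequential_masks(x, W_list, gamma=1.3):
    """Sketch of TabNet-style sequential feature selection over input x.

    Each step produces an attention mask over features; features already
    attended to are down-weighted in later steps via the prior scales.
    """
    prior = np.ones_like(x)
    masks = []
    for W in W_list:                     # one mask per decision step
        m = softmax((W @ x) * prior)     # mask over features, sums to 1
        masks.append(m)
        prior = prior * (gamma - m)      # discourage reusing the same features
    return masks

rng = np.random.default_rng(1)
x = rng.normal(size=5)
W_list = [rng.normal(size=(5, 5)) for _ in range(3)]
masks = sequential_masks(x, W_list)
```

Inspecting the masks per step is what gives the architecture its claimed interpretability: they show which features each decision step attended to.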
https://arxiv.org/abs/1602.04938 "Why Should I Trust You?": Explaining the Predictions of Any Classifier (arxiv.org). Abstract: Despite widespread adoption, machine learning models remain mostly black boxes. Understanding the reasons behind predictions is, however, quite important in assessing trust, which is fundamental if one plans to take action based on a prediction, or when choosing ..
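The paper's method (LIME) explains a single prediction by fitting a locally weighted linear model to perturbed copies of the instance. A minimal sketch under simplifying assumptions (Gaussian perturbations, an exponential proximity kernel, no interpretable-feature mapping; this is not the lime library's API):

```python
import numpy as np

def lime_explain(predict_fn, x, n_samples=500, width=0.75, seed=0):
    """Return per-feature importances for predict_fn's output near instance x."""
    rng = np.random.default_rng(seed)
    # 1. Perturb the instance with Gaussian noise
    Z = x + rng.normal(scale=0.5, size=(n_samples, x.size))
    y = predict_fn(Z)
    # 2. Weight samples by proximity to x (exponential kernel)
    dist = np.linalg.norm(Z - x, axis=1)
    sw = np.sqrt(np.exp(-dist**2 / width**2))
    # 3. Fit a weighted linear model; its coefficients are the explanation
    Zb = np.hstack([Z, np.ones((n_samples, 1))])        # add intercept column
    coef, *_ = np.linalg.lstsq(Zb * sw[:, None], y * sw, rcond=None)
    return coef[:-1]                                    # drop the intercept

# Black box that secretly uses only feature 0
coefs = lime_explain(lambda Z: 3 * Z[:, 0], np.array([1.0, 2.0]))
```

For this toy black box the fitted coefficients recover that feature 0 drives the prediction while feature 1 is irrelevant, which is exactly the kind of local explanation the paper argues builds trust.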