Paper 1: Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks; Paper 2: Towards Evaluating the Robustness of Neural Networks
2023.04.14
김태욱
Topic: Density peaks clustering using geodesic distances Keywords: Data clustering, Density peaks clustering, Geodesic distances Reference: Du, M., Din…
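For context, a minimal sketch of the geodesic-distance step this line of work substitutes for plain Euclidean distance, approximated as shortest paths on a k-NN graph (the helper name and neighbor count are illustrative, not taken from the paper):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(X, n_neighbors=10):
    """Approximate geodesic distances as shortest paths on a k-NN graph."""
    graph = kneighbors_graph(X, n_neighbors=n_neighbors, mode="distance")
    # symmetrize so the graph is undirected before running Dijkstra
    graph = graph.maximum(graph.T)
    return shortest_path(graph, method="D", directed=False)
```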
2023.03.31
김철규
Topic: VAE-BRIDGE: Variational Autoencoder Filter for Bayesian Ridge Imputation of Missing Data Keywords: missing data, variational autoencoder, bayesian r…
이형권
Topic: Physics-informed Bayesian optimization Keywords: Physics-informed machine learning, Gaussian process, Bayesian optimization, Structured probabilisti…
김주헌
Keywords: Object detection, Non-maximum suppression, Bounding box regression
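For context on the keywords above, a minimal greedy non-maximum suppression sketch in NumPy (function name and IoU threshold are illustrative, not from the reviewed paper):

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS. boxes: (N, 4) [x1, y1, x2, y2]; scores: (N,). Returns kept indices."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # IoU of the top box with the remaining candidates
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # drop boxes that overlap the kept box too much
        order = order[1:][iou <= iou_threshold]
    return keep
```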
2023.03.24
GOGGLE: Generative Modeling for Tabular Data by Learning Relational Structure
한민석
Topic: Constrained Monotonic Neural Networks Keywords: Interpretable deep learning, Model generalization, Neural network, Monotonicity Reference: …
2023.03.17
김현호
Topic: RIFS: a randomly restarted incremental feature selection algorithm Keywords: RIFS, RIFS2D, Feature selection Reference: Ye, Y., Zhang, R., Zheng…
2023.02.15
Topic: Image Super-Resolution Using Deep Convolutional Networks Keywords: Super-resolution, Deep convolutional neural networks, Sparse coding Reference:…
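For context, a sketch of the three-layer SRCNN architecture from this paper (9-1-5 kernel sizes, 64/32 channels), written here in PyTorch as an illustration; the padding, which preserves spatial size, is an implementation choice, not from the paper:

```python
import torch.nn as nn

class SRCNN(nn.Module):
    """Patch extraction -> non-linear mapping -> reconstruction."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=9, padding=4),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, kernel_size=5, padding=2),
        )

    def forward(self, x):          # x: bicubically upscaled low-resolution image
        return self.net(x)
```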
Topic: Similarity of Neural Network Representations Revisited Keywords: Neural Network Representation Similarity Reference: 1. Kornblith, Simon, …
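For context, a minimal sketch of the linear centered kernel alignment (CKA) index that Kornblith et al. propose for comparing representations (variable names are illustrative):

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between activation matrices X: (n, p1) and Y: (n, p2)
    computed on the same n examples."""
    # center each feature dimension
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    numerator = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    denominator = np.linalg.norm(X.T @ X, ord="fro") * np.linalg.norm(Y.T @ Y, ord="fro")
    return numerator / denominator
```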
2023.02.01
Learning Loss for Active Learning (CVPR 2019)
2023.02.08
Learning physics-constrained dynamics using autoencoders (NeurIPS 2022)
Paper review: 1. Distilling the Knowledge in a Neural Network (G. Hinton et al., NIPS 2014) 2. Self-training with Noisy Student improves ImageNet classification…
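For context, a minimal NumPy sketch of the Hinton-style distillation objective, soft targets at temperature T plus the usual hard-label cross-entropy (the T and alpha defaults are illustrative, not from the paper):

```python
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target cross-entropy at temperature T (scaled by T^2, as in the paper)
    blended with standard cross-entropy on the hard labels."""
    p_teacher = softmax(teacher_logits, T)
    p_student_T = softmax(student_logits, T)
    soft_loss = -(p_teacher * np.log(p_student_T + 1e-12)).sum(axis=1).mean() * T**2
    p_student = softmax(student_logits)
    hard_loss = -np.log(p_student[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_loss + (1 - alpha) * hard_loss
```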
2023.01.18
Topic: Clustering using silhouette coefficients Keywords: Cluster analysis, Cluster validity index, Silhouette coefficient, Objective function clustering …
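For context, a toy example scoring a k-means clustering with the silhouette coefficient via scikit-learn (the synthetic data and cluster count are illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_samples, silhouette_score

# three well-separated Gaussian blobs as toy data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in ((0, 0), (3, 3), (0, 3))])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("mean silhouette:", silhouette_score(X, labels))      # close to 1 => well separated
print("per-sample silhouettes:", silhouette_samples(X, labels).round(2))
```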
Topic: Feedback loop in machine learning Keywords: Feedback loop, Recommender system, Control system Reference: A. Khritankov, "Hidden Feedback Loops i…
2023.01.03