
Preprint

o P. Wang, Y. Lei, Y. Ying and H. Zhang. Differentially private SGD with non-smooth losses. To appear in *Applied and Computational Harmonic Analysis (ACHA)*, 2022.

2021

o Z. Yang, Y. Lei, P. Wang, T. Yang, and Y. Ying. Simple stochastic and online gradient descent for pairwise learning. *Advances in Neural Information Processing Systems (NeurIPS)*, 2021.

o Y. Lei, M. Liu, and Y. Ying. Generalization guarantees of SGD for pairwise learning. *Advances in Neural Information Processing Systems (NeurIPS)*, 2021.

o M. Natole, Y. Ying, A. Buyantuev, M. Stessin, V. Buyantuev, and A. Lapenas. Patterns of Forest Fires will become Less Predictable with Climate Warming. To appear in *Environmental Advances*, 2021.

o Y. Lei and Y. Ying. Stochastic Proximal AUC Maximization. *Journal of Machine Learning Research*, 2021.

o Y. Lei and Y. Ying. Sharper generalization bounds for learning with gradient-dominated objective functions. *International Conference on Learning Representations (ICLR)*, 2021.

o Z. Yang, Y. Lei, S. Lyu and Y. Ying. Stability and differential privacy of stochastic gradient descent for pairwise learning with non-smooth loss. *International Conference on Artificial Intelligence and Statistics (AISTATS)*, 2021.

o H. Sapkota, Y. Ying, F. Chen and Q. Yu. Distributionally robust optimization for deep kernel multiple instance learning. *International Conference on Artificial Intelligence and Statistics (AISTATS)*, 2021.

2020

o S. Hu, Y. Ying, X. Wang and S. Lyu. Learning with Minimizing the Sum of Ranked Range. *Advances in Neural Information Processing Systems (NeurIPS)*, 2020.

o B. Zhou, Y. Ying and S. Skiena. Online AUC Optimization for Sparse High-Dimensional Datasets. *International Conference on Data Mining (ICDM)*, 2020.

o Z. Yang, B. Zhou, Y. Lei, and Y. Ying. Stochastic Hard Thresholding Algorithms for AUC Maximization. *International Conference on Data Mining (ICDM)*, 2020.

o Y. Lei and Y. Ying. Fine-grained analysis of stability and generalization for SGD. *International Conference on Machine Learning (ICML)*, 2020.

o W. Shen, Z. Yang, Y. Ying and X. Yuan. Stability and optimization error of stochastic gradient descent for pairwise learning. To appear in *Analysis and Applications*, 2020.

o Z. Yang, W. Shen, Y. Ying and X. Yuan. Stochastic AUC optimization with general loss. *Communications on Pure and Applied Analysis*, 19(8): 4191-4212, 2020.

o Y. Feng and Y. Ying. Learning with correntropy-induced losses for regression with mixture of symmetric stable noise. *Applied and Computational Harmonic Analysis (ACHA)*, 48: 795-810, 2020.

2019

o B. Zhou, F. Chen and Y. Ying. Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization. *Thirty-sixth International Conference on Machine Learning (ICML)*, Long Beach, CA, 2019.

o B. Zhou, F. Chen and Y. Ying. Dual Averaging Method for Online Graph-structured Sparsity. *ACM SIGKDD Conference on Knowledge Discovery and Data Mining* (KDD), 2019.

2018

o Q. Fang, M. Xu and Y. Ying. Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems. *Analysis and Applications*, 16(5): 741-755, 2018.

o M. Natole Jr, Y. Ying and S. Lyu. Stochastic proximal algorithms for AUC maximization. *International Conference on Machine Learning (ICML)*, 2018.

o S. Lyu and Y. Ying. A univariate bound of area under ROC. *International Conference on Uncertainty in Artificial Intelligence (UAI)*, Monterey Bay, CA, 2018.

o Y. Wei, M.-C. Chang, Y. Ying, S. Lim, and S. Lyu. Explain black-box image classifications using superpixel-based interpretation. *International Conference on Pattern Recognition (ICPR)*, Beijing, China, 2018.

o J. Bohne, Y. Ying, S. Gentric, and M. Pontil. Learning local metrics from pairwise similarity data. *Pattern Recognition*, 75: 315-326, 2018.

2017

o Y. Ying and D.X. Zhou. Unregularized online learning algorithms with general loss functions. *Applied and Computational Harmonic Analysis (ACHA)*, 42(2): 224-244, 2017.

o Z. C. Guo, Y. Ying, and D. X. Zhou. Online regularized learning with pairwise loss functions. To appear in *Advances in Computational Mathematics*, 2017.

2016

o Y. Ying, L. Wen and S. Lyu. Stochastic online AUC maximization. *Advances in Neural Information Processing Systems (NIPS)*, 2016. (Oral presentation)

o Y. Ying and D.X. Zhou. Online Pairwise Learning Algorithms. *Neural Computation*, 28: 743-777, 2016.

o Q. Cao, Z. C. Guo and Y. Ying. Generalization bounds for metric and similarity learning. *Machine Learning Journal*, 102(1): 115-132, 2016.

o M. Boissier, S. Lyu, Y. Ying, and D.-X. Zhou. Fast convergence of online pairwise learning algorithms. *International Conference on Artificial Intelligence and Statistics (AISTATS)*, 2016.

o X. Wang, M.C. Chang, Y. Ying, and S. Lyu. Co-Regularized PLSA for multi-modal learning. *The Thirtieth AAAI Conference on Artificial Intelligence (AAAI)*, 2016.

2015

o F. Guest, R. Everson, Y. Ying and D. Huang. Towards automatic prediction of tumor growth from CT images using machine learning algorithms - a feasibility study. *European Cancer Congress*, 2015.

o Y. Lei and Y. Ying. Generalization analysis for multi-modal metric learning. To appear in *Analysis and Applications*, 2015.

o M. Rogers, C. Campbell and Y. Ying. Probabilistic inference of biological networks via data integration. *BioMed Research International*, Article ID: 707453, 2015.

2014

o J. Bohne, Y. Ying, S. Gentric and M. Pontil. Large margin local metric learning. *European Conference on Computer Vision (ECCV)*, 2014.

o Z. C. Guo and Y. Ying. Guaranteed classification via regularised similarity learning. *Neural Computation*, 26(3), 2014.

2013

o Q. Cao, Y. Ying and P. Li. Similarity metric learning for face recognition. *IEEE International Conference on Computer Vision (ICCV)*, 2013.

o Q. Cao, Z. C. Guo and Y. Ying. Generalization bounds for metric and similarity learning. arXiv preprint, version 2 (under revision for *Machine Learning Journal*), 2013.

2012

o Y. Ying. Multi-task coordinate gradient learning. *ICML workshop on "object, functional and structured data: towards next generation kernel-based methods"*, 2012.

o Q. Cao, Y. Ying and P. Li. Distance metric learning revisited. *ECML-PKDD*, 2012. (version 1, April 2012)

o Y. Ying and P. Li. Distance metric learning with eigenvalue optimization. *Journal of Machine Learning Research (Special topics on kernel and metric learning)*, 13: 1-26, 2012.

2011

o C. Campbell and Y. Ying. Learning with Support Vector Machines. *Morgan & Claypool Publishers*, 2011.

o Y. Ying, K. Huang and C. Campbell. Distance metric learning with sparse regularization. *Technical Report, University of Exeter*, September 2010.

2010

o Y. Ying, Q. Wu and C. Campbell. Learning the coordinate gradients. To appear in *Advances in Computational Mathematics*, 2010.

o Y. Ying and C. Campbell. Rademacher chaos complexity for learning the kernel. *Neural Computation*, 22(11), 2010. (version 1, October 2008)

This second version is a substantial extension of the COLT (2009) conference paper "Generalization bounds for learning the kernel". In particular, we provided a self-contained proof for bounding the Rademacher chaos complexity by metric entropy integrals, and also corrected an inaccurate claim on generalization bounds derived from the covering number approach (which appeared at the end of Section 3 of the COLT conference version).

o K. Huang, Y. Ying and C. Campbell. Generalized sparse metric learning with relative comparisons. *Journal of Knowledge and Information Systems (KAIS)*, 2010.

2009

o K. Huang, Y. Ying and C. Campbell. GSML: A unified framework for sparse metric learning. *IEEE International Conference on Data Mining (ICDM)*, 2009.

o Y. Ying, K. Huang and C. Campbell. Sparse metric learning via smooth optimization. *Advances in Neural Information Processing Systems (NIPS)*, 2009.

o Y. Ying, C. Campbell and M. Girolami. Analysis of SVM with indefinite kernels. *Advances in Neural Information Processing Systems (NIPS)*, 2009. (Spotlight presentation)

o Y. Ying, K. Huang and C. Campbell. Enhanced protein fold recognition through a novel data integration approach. *BMC Bioinformatics* (open access), 10:267, 2009.

o See also its short paper: Information theoretic kernel integration. *NIPS workshop on Learning from multiple sources*, 2009.

o Y. Ying and C. Campbell. Generalization bounds for learning the kernel. *Proceedings of the 22nd Annual Conference on Learning Theory (COLT)*, 2009.

2008

o Y. Ying, C. Campbell, T. Damoulas, and M. Girolami. Class prediction from disparate biological data sources using a simple multi-class multi-kernel algorithm. *4th IAPR International Conference on Pattern Recognition in Bioinformatics*, 2009. (Was preprint, 2008)

o T. Damoulas, Y. Ying, M. Girolami, and C. Campbell. Inferring sparse kernel combination and relevance vectors: an application to subcellular localization of proteins. *International Conference on Machine Learning and Applications (ICMLA)*, 2008.

o Y. Ying and C. Campbell. Learning coordinate gradients with multi-task kernels. *Proceedings of the 21st Annual Conference on Learning Theory (COLT)*, 2008. MATLAB code available upon request.

o P. Agius, Y. Ying and C. Campbell. Bayesian unsupervised learning with multiple data types. *Statistical Applications in Genetics and Molecular Biology*, Vol. 8: Iss. 1, 2009. (Was Technical Report, January 2008)

o P. Li, Y. Ying and C. Campbell. A variational approach to semi-supervised clustering. *Proceedings of ESANN*, 2009. (Was preprint, 2008)

2007

o Y. Ying, P. Li and C. Campbell. A marginalized variational Bayesian approach to the analysis of array data. *BMC Proceedings of the Workshop on Machine Learning in Systems Biology*, 2007.

o A. Argyriou, C. A. Micchelli, M. Pontil, and Y. Ying. A spectral regularization framework for multi-task structure learning. *Advances in Neural Information Processing Systems (NIPS)*, 2007.

o A. Caponnetto, C. A. Micchelli, M. Pontil, and Y. Ying. Universal multi-task kernels. *Journal of Machine Learning Research*, 9 (2008), 1615-1646. (Was Technical Report, University College London, December 2006)

Characterization of universal matrix-valued (and more general operator-valued) kernels. These characterizations are highlighted with numerous examples of practical importance in multi-task learning.

2006

o M. Pontil and Y. Ying. Online gradient descent learning algorithms. *Foundations of Computational Mathematics*, 5 (2008), 561-596. (Was Technical Report, University College London, 2005)

Generalization analysis of online stochastic gradient descent algorithms in reproducing kernel Hilbert spaces. In particular, we show that their error rates are competitive with those of offline regularization algorithms.

o Y. Ying and D.X. Zhou. Online regularized classification algorithms. *IEEE Trans. Inform. Theory (regular paper)*, 11 (2006), 4775-4788.

2005

o Y. Ying and D.X. Zhou. Learnability of Gaussians with flexible variances. *Journal of Machine Learning Research*, 8 (2007), 249-276. (Was Technical Report, City University of Hong Kong, 2004)

Characterization of the statistical consistency of learning-the-kernel algorithms via the V-gamma dimension of the set of candidate kernels.

o Q. Wu, Y. Ying and D.X. Zhou. Multi-kernel regularized classifiers. *Journal of Complexity*, 2006. (Was preprint, 2004)

o Y. Ying. Convergence analysis of online algorithms. *Advances in Computational Mathematics*, 27 (2007), 273-291. (Was preprint, City University of Hong Kong, 2005)

o
Q. Wu,
Y. Ying and D.X. Zhou, Learning rates of least-square regularized regression, *Foundations
of Computational Mathematics*, 6 (2006), 171-192.

2004

o Y. Ying. McDiarmid's inequalities of Bernstein and Bennett forms. *Technical Report, City University of Hong Kong*, 2004.

o Q. Wu, Y. Ying and D.X. Zhou. Learning theory: from regression to classification. *Topics in Multivariate Approximation and Interpolation*, K. Jetter et al., Editors, (2004), 101-134.

o D.R. Chen, Q. Wu, Y. Ying and D.X. Zhou. Support vector machine soft margin classifiers: error analysis. *Journal of Machine Learning Research*, 5 (2004).
