Copyright notice:
The material made available below is presented to ensure timely dissemination of
scholarly and technical work and may differ to some extent from the final
published versions.
Copyright and all rights therein are retained by authors or by other copyright holders.
|
Preprints
- O. Zadorozhnyi, G. Blanchard, A. Carpentier.
Restless dependent bandits with fading memory.
[arXiv]
- F. Göbel, G. Blanchard.
Volume Doubling Condition and a Local Poincaré Inequality on Unweighted Random Geometric Graphs.
[arXiv]
- G. Blanchard, P. Mathé, N. Mücke.
Lepskii Principle in Supervised Learning.
[arXiv]
- Abhishake Rastogi, G. Blanchard, P. Mathé.
Convergence analysis of Tikhonov regularization for non-linear
statistical inverse learning problems.
[arXiv]
- G. Durand, G. Blanchard, P. Neuvial, E. Roquain.
Post hoc false positive control for spatially structured hypotheses.
[arXiv]
- G. Blanchard, A. Deshmukh, U. Dogan, G. Lee, C. Scott.
Domain Generalization by Marginal Transfer Learning.
[arXiv]
- R. Gribonval, G. Blanchard, N. Keriven, Y. Traonmilin.
Compressive Statistical Learning with Random Feature Moments.
[arXiv]
- G. Blanchard, P. Neuvial, E. Roquain.
Post hoc inference via joint family-wise error rate control.
[arXiv]
- G. Blanchard, N. Mücke.
Kernel regression, minimax rates and effective dimensionality: beyond the regular case.
[arXiv]
|
Journal papers, Book sections (Published/To appear)
- G. Blanchard, O. Zadorozhnyi.
Concentration of weakly dependent Banach-valued sums and applications to kernel learning methods.
Bernoulli, 2019.
[arXiv Version]
- J. Katz-Samuels, G. Blanchard, C. Scott.
Decontamination of Mutual Contamination Models.
Journal of Machine Learning Research, 2019.
[arXiv]
- G. Blanchard, N. Mücke.
Parallelizing Spectral Algorithms for Kernel Learning.
Journal of Machine Learning Research 19(30):1-29, 2018.
[JMLR]
- G. Blanchard, A. Carpentier, M. Gutzeit.
Minimax Euclidean Separation Rates for Testing Convex Hypotheses in R^d.
Electron. J. Statist. 12 (2): 3713-3735, 2018.
[Project Euclid Open Access]
- F. Bachoc, G. Blanchard, P. Neuvial.
On the post selection inference constant under restricted isometry properties.
Electron. J. Statist. 12(2): 3736-3757, 2018.
[DOI, Project Euclid Open Access]
- G. Blanchard, M. Hoffmann, M. Reiß.
Early stopping for statistical inverse problems via truncated SVD estimation.
Electron. J. Statist. 12 (2):3204-3231, 2018.
[DOI, Project Euclid Open Access]
- G. Blanchard, M. Hoffmann, M. Reiß.
Optimal adaptation for early stopping in statistical inverse problems.
SIAM/ASA Journal on Uncertainty Quantification 6(3): 1043-1075, 2018.
[DOI]
[arXiv Version]
- G. Blanchard, F. Göbel, U. von Luxburg.
Construction of Tight Frames on Graphs and Application to Denoising.
In Handbook of Big Data Analytics, Härdle, W., Lu, H.-S. and Xen, S. editors,
Chapter 20, pp. 503-522, Springer, 2018.
[DOI]
[arXiv Version]
- G. Blanchard, N. Mücke.
Optimal Rates For Regularization Of Statistical Inverse Learning Problems.
Foundations of Computational Mathematics 18 (4): 971-1013, 2018 (first online: 2017).
[DOI]
[arXiv Version]
- G. Blanchard, N. Krämer.
Convergence rates of Kernel Conjugate Gradient for
random design regression.
Analysis and Applications 14 (6): 763-794, 2016.
[DOI]
[arXiv Version]
- B. Mieth, M. Kloft, J. A. Rodríguez, S. Sonnenburg, R. Vobruba, C. Morcillo-Suárez, X. Farré, U.M. Marigorta, E. Fehr, T. Dickhaus, G. Blanchard, D. Schunk, A. Navarro & K.-R. Müller.
Combining Multiple Hypothesis Testing with Machine Learning Increases the Statistical Power of Genome-wide Association Studies.
Scientific Reports 6: 36671, 2016.
[DOI, Nature Open Access]
- G. Blanchard, M. Flaska, G. Handy, S. Pozzi, C. Scott.
Classification with
asymmetric label noise: Consistency and maximal denoising.
Electronic Journal of Statistics 10 (2): 2780-2824, 2016.
[Project Euclid (CCAL License)]
With corrigendum:
Electronic Journal of Statistics 12 (1): 1779-1781, 2018.
[Project Euclid (CCAL License)]
- A. Beinrucker, Ü. Dogan, G. Blanchard.
Extensions of stability selection using subsamples of observations and covariates.
Statistics and Computing 26: 1059-1077, 2016 (First online 2015).
[DOI]
[arXiv version]
- G. Blanchard, S. Delattre, E. Roquain.
Testing over a Continuum of Null Hypotheses with False Discovery Rate Control.
Bernoulli 20(1): 304-333, 2014.
[DOI, Open Access]
- G. Blanchard, T. Dickhaus, E. Roquain, F. Villers.
On least favorable configurations for step-up-down tests.
Statistica Sinica 24(1): 1-23, 2014.
[DOI]
[Supplement]
[arXiv version]
- G. Blanchard, P. Mathé.
Discrepancy Principle for Statistical Inverse Problems with Application to Conjugate Gradient Iteration.
Inverse Problems 28 (11): 115011, 2012.
[DOI]
[preprint version]
- M. Kloft, G. Blanchard.
The Local Rademacher Complexity of lp-Norm Multiple Kernel Learning.
Journal of Machine Learning Research 13: 2465-2502, 2012.
[JMLR]
[arXiv version]
- G. Blanchard, P. Mathé.
Conjugate gradient regularization under general smoothness and noise
assumptions. Journal of Inverse and Ill-posed Problems 18(6):
701-726, 2010.
[DOI]
- G. Blanchard, G. Lee, C. Scott.
Semi-Supervised Novelty Detection. Journal of Machine Learning Research
11(Nov): 2973-3009, 2010.
[JMLR]
- S. Arlot, G. Blanchard, E. Roquain.
Some non-asymptotic results on resampling in high dimension, I:
Confidence regions. Annals of
Statistics 38(1): 51-82, 2010.
[arXiv version]
- S. Arlot, G. Blanchard, E. Roquain.
Some non-asymptotic results on resampling in high dimension, II:
Multiple tests. Annals of
Statistics 38(1): 83-99, 2010.
[arXiv version]
- G. Blanchard, E. Roquain.
Adaptive FDR control under independence and dependence.
Journal of Machine Learning Research 10:2837-2871, 2009.
[abs]
[pdf] (JMLR site)
- A. Schwaighofer, T. Schröter, S. Mika,
G. Blanchard.
How wrong can we get? A review of machine learning approaches and error bars.
Combinatorial Chemistry & High Throughput Screening, 12 (5): 453-468, 2009.
[DOI]
- G. Blanchard, E. Roquain.
Two simple sufficient conditions for FDR control. Electronic Journal of
Statistics, 2: 963-992, 2008.
[Project Euclid Open Access]
- G. Blanchard, L. Zwald.
Finite dimensional projection for classification and
statistical learning. IEEE Transactions on Information Theory, 54
(9): 4169-4182, 2008.
[DOI]
- G. Blanchard, O. Bousquet,
P. Massart. Statistical Performance of Support Vector
Machines.
Annals of Statistics, 36 (2): 489-531, 2008.
[arXiv version]
- M. Sugiyama, M. Kawanabe, G. Blanchard,
K.-R. Müller. Approximating the Best Linear
Unbiased Estimator of Non-Gaussian Signals with Gaussian Noise.
IEICE Transactions on Information and Systems,
E91-D (5): 1577-1580, 2008.
- G. Blanchard, C. Schäfer,
Y. Rozenholc, K.-R. Müller. Optimal Dyadic
Decision Trees. Machine Learning, 66(2-3): 209-242, 2007.
- M. Kawanabe, M. Sugiyama, G. Blanchard, K.-R. Müller.
A new algorithm of non-Gaussian component analysis with
radial kernel functions.
Annals of the Institute of Statistical Mathematics, 59(1):57-75,
2007.
- G. Blanchard, O. Bousquet,
L. Zwald. Statistical Properties of Kernel Principal
Component Analysis. Machine Learning, 66(2-3): 259-294, 2007.
- G. Blanchard, P. Massart.
Discussion of V.Koltchinskii's 2004 IMS Medallion Lecture
paper, "Local Rademacher complexities and oracle inequalities
in risk minimization". Annals of
Statistics, 34(6), 2006.
[arXiv version]
- G. Blanchard, M. Kawanabe,
M. Sugiyama, V. Spokoiny,
K.-R. Müller.
In search of non-Gaussian components
of a high-dimensional distribution.
Journal of Machine Learning Research, 7:247-282, 2006.
[abs]
[pdf] (JMLR site)
- G. Blanchard, D. Geman.
Hierarchical testing designs for pattern recognition.
Annals of Statistics, 33(3):1155-1202, 2005. (This is a shortened and
revised version of the technical report below).
[arXiv version]
- G. Blanchard. Different paradigms for
choosing sequential reweighting algorithms. Neural Computation,
16:811-836, 2004.
- G. Blanchard, B. Blankertz. BCI
competition 2003 - data set IIa: Spatial patterns of self-controlled
brain rhythm modulations. IEEE Trans. Biomed. Eng.,
51(6):1062-1066, 2004.
- G. Blanchard. Un algorithme
accéléré d'échantillonnage Bayésien
pour le modèle CART. (An accelerated Bayesian sampling
algorithm for the CART model.) Revue d'Intelligence artificielle,
18(3):383-410, 2004.
English version (somewhat outdated): A new algorithm for MCMC Bayesian CART sampling.
[gzipped ps]
- G. Blanchard, G. Lugosi, N. Vayatis.
On the Rate of Convergence of Regularized Boosting Classifiers. Journal
of Machine Learning Research (Special issue on learning theory),
4:861-894, 2003.
[JMLR page for this issue]
- G. Blanchard. Generalization error
bounds for aggregate classifiers. In Nonlinear Estimation and
Classification, Denison, D. D., Hansen, M., Holmes, C. C.,
Mallick, B. and Yu, B. editors, Lecture Notes in Statistics (171),
Springer, 357-368, 2003.
- G. Blanchard, M. Olsen. Le
système des renvois dans l'Encyclopédie : une
cartographie des structures de connaissance au XVIIIème
siècle. (The system of cross-references in the Encyclopédie:
a map of knowledge structures in the eighteenth century.)
Recherches sur Diderot et l'Encyclopédie, 45-70, 2002.
[Site of RDE, full PDF]
- G. Blanchard. The "progressive
mixture" estimator for regression trees. Annales de l'I.H.P.,
35(6):793-820, 1999.
[NUMDAM link]
- G. Blanchard. L'estimateur de
«mélange progressif» appliqué aux arbres de
décision. (The "progressive mixture" estimator applied to
decision trees.) C.R.A.S., 328, Série I:925-928, 1999.
|
Conference Proceedings (peer reviewed)
- J. Achdou, J. Lam, A. Carpentier, G. Blanchard.
A minimax near-optimal algorithm for adaptive rejection sampling.
Algorithmic Learning Theory (ALT 2019).
[arXiv Version]
- I. Tolstikhin, N. Zhivotovskiy, G. Blanchard.
Permutational Rademacher Complexity.
Algorithmic Learning Theory (Proc. ALT 2015), Springer Lecture Notes in Artificial Intelligence (9355), 209-223, 2015.
[arXiv Version]
- S. Kurras, U. von Luxburg, G. Blanchard.
The f-Adjusted Graph Laplacian: a Diagonal Modification with a Geometric Interpretation.
Proc. ICML 2014, JMLR Workshop and
Conference Proceedings 32:1530-1538, 2014.
[pdf]
[supplementary]
- I. Tolstikhin, G. Blanchard, M. Kloft.
Localized Complexities for Transductive Learning.
Proc. COLT 2014, JMLR Workshop and
Conference Proceedings 35: 857-884, 2014.
[pdf]
- G. Blanchard, C. Scott.
Decontamination of Mutually Contaminated Models.
Proc. AISTATS 2014, JMLR Workshop and
Conference Proceedings 33:1-9, 2014.
[pdf]
[supplementary]
- C. Scott, G. Blanchard, G. Handy.
Classification with Asymmetric Label Noise: Consistency and Maximal Denoising.
Proc. Conf. on Learning Theory (COLT 2013),
JMLR Workshop and Conference Proceedings 30:489-511, 2013.
[pdf]
- R. Martinez-Noriega, A. Roumy, G. Blanchard.
Exemplar-Based Image Inpainting:
Fast Priority And Coherent Nearest Neighbor Search.
IEEE International Workshop on
Machine Learning for Signal Processing (MLSP 2012), 2012.
- A. Beinrucker, U. Dogan, G. Blanchard.
Early Stopping for Mutual Information
Based Feature Selection.
International Conference on Pattern Recognition (ICPR 2012),
975-978, 2012.
- A. Beinrucker, U. Dogan, G. Blanchard.
A simple extension of stability feature selection.
Pattern Recognition (Proceedings of the joint 34th DAGM and 36th OAGM Symposium), 256-265, 2012.
- G. Blanchard, G. Lee, C. Scott.
Generalizing from Several Related Classification Tasks to a New Unlabeled Sample.
Advances in Neural Inf. Proc. Systems (NIPS 2011), 2178-2186, 2011.
[NIPS proceedings]
- M. Kloft, G. Blanchard.
The Local Rademacher Complexity of lp-Norm Multiple Kernel Learning.
Advances in Neural Inf. Proc. Systems 24 (NIPS 2011), 2438-2446, 2011.
[NIPS proceedings]
- G. Blanchard, N. Krämer.
Optimal learning rates for Kernel Conjugate Gradient regression.
Advances in Neural Inf. Proc. Systems (NIPS 2010), 226-234, 2011.
[NIPS proceedings]
- G. Blanchard, T. Dickhaus, N. Hack, F. Konietschke,
K. Rohmeyer, J. Rosenblatt, M. Scheer, W. Werft.
μTOSS - Multiple hypothesis testing in an open software system.
Proceedings of the First Workshop on Applications of Pattern Analysis,
JMLR Workshop and
Conference Proceedings 11:12-19, 2010.
[pdf]
- G. Blanchard, N. Krämer.
Kernel Partial Least Squares is Universally Consistent.
AISTATS 2010,
JMLR Workshop and
Conference Proceedings 9:57-64, 2010.
[pdf]
- C. Scott, G. Blanchard.
Novelty detection: unlabeled data definitely help. AISTATS 2009,
JMLR Workshop and
Conference Proceedings 5:464-471, 2009.
[pdf]
- G. Blanchard,
F. Fleuret.
Occam's Hammer.
Proceedings of the 20th Conference on Learning Theory
(COLT 2007), Springer Lecture Notes in Computer Science
(4539), 112-126, 2007.
- S. Arlot, G. Blanchard,
E. Roquain.
Resampling-based
confidence regions and multiple tests for a correlated random vector.
Proceedings of the 20th Conference on Learning Theory
(COLT 2007), Springer Lecture Notes in Computer Science
(4539), 127-141, 2007.
- M. Kawanabe, G. Blanchard,
M. Sugiyama, V. Spokoiny, K.-R. Müller.
A novel dimension reduction procedure for searching
non-gaussian subspaces.
Independent Component Analysis and Blind Signal
Separation (ICA 06), Springer Lecture Notes in Computer Science
(3889), 149-156, 2006.
- M. Sugiyama, M. Kawanabe, G. Blanchard,
V. Spokoiny, K.-R. Müller.
Obtaining the best linear unbiased estimator of noisy
signals by non-gaussian component analysis.
IEEE International Conference on Acoustics, Speech,
and Signal Processing (ICASSP 06),
volume 3, pp. 608-611, 2006.
- L. Zwald, G. Blanchard.
On the convergence of eigenspaces in kernel principal
components analysis. In
Advances in Neural Inf. Proc. Systems (NIPS 05),
volume 18, 1649-1656, MIT Press, 2006.
[NIPS Proceedings]
- F. Fleuret, G. Blanchard.
Pattern recognition from one example via chopping. In
Advances in Neural Inf. Proc. Systems (NIPS 05),
volume 18, 371-378, MIT Press, 2006.
[NIPS Proceedings]
- G. Blanchard, M. Kawanabe,
M. Sugiyama, V. Spokoiny, K.-R. Müller.
Non-Gaussian component analysis: a semi-parametric
framework for linear dimension reduction. In
Advances in Neural Inf. Proc. Systems (NIPS 05),
volume 18, 131-138, MIT Press, 2006.
[NIPS Proceedings]
- G. Blanchard, P. Massart, R. Vert, L. Zwald.
Kernel Projection Machine: a new tool for pattern recognition.
In Advances in Neural Inf. Proc. Systems (NIPS
2004), volume 17, 1649-1656, MIT Press, 2005.
[NIPS Proceedings]
- G. Blanchard, C. Schäfer,
Y. Rozenholc. Oracle bounds and exact algorithm for dyadic
classification trees. In Proceedings of the 17th Conference on
Learning Theory (COLT 04), Springer Lecture
Notes in Artificial Intelligence (3120), 378-392, 2004.
- O. Bousquet, L. Zwald, G. Blanchard.
Statistical properties of kernel principal component analysis. In Proceedings
of the 17th Conference on Learning Theory (COLT 2004).
Springer Lecture Notes in Artificial Intelligence (3120), 594-608,
2004.
|
PhD Thesis
- G. Blanchard. Méthodes de
mélange et d'agrégation d'estimateurs en reconnaissance de
formes. Application aux arbres de décision. (Mixture and
aggregation of estimators for pattern recognition. Application to
decision trees.) PhD dissertation, Université
Paris 13 - Paris Nord, 2001. (In English, with an introductory part in
French) [gzipped ps]
|
Technical reports and unpublished
- G. Blanchard, N. Mücke.
Kernel regression, minimax rates and effective dimensionality: beyond the regular case.
[arXiv]
- G. Blanchard, D. Geman.
Hierarchical testing designs for pattern recognition. Technical report
2003-07, Université Paris-Sud, 2003.
[gzipped ps]
- Y. Amit, G. Blanchard. Multiple
Randomized Classifiers. Technical report, University of Chicago, 2001.
[gzipped ps]
- G. Blanchard. Modélisations
déterministe et stochastique d'un apprentissage
élémentaire. (Deterministic and stochastic modelling of an
elementary learning process. Rapport de DEA/Master's thesis.)
[gzipped ps]
|