JMLR



Message History

MALA is a popular gradient-based Markov chain Monte Carlo method to access the Gibbs-posterior distribution. Stochastic MALA (sMALA) scales to large data sets, but changes the target distribution from the Gibbs-posterior to a surrogate posterior which only exploits a reduced sample size. We introduce a corrected stochastic MALA (csMALA) with a simple correction term for which di...
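MALA here is the Metropolis-adjusted Langevin algorithm: a Langevin proposal driven by the gradient of the log-density, corrected by a Metropolis-Hastings accept/reject step. A minimal sketch of standard MALA (not the paper's csMALA correction), assuming the log-density and its gradient can be evaluated:

```python
import numpy as np

def mala(log_prob, grad_log_prob, x0, step=0.1, n_steps=1000, seed=0):
    """Metropolis-adjusted Langevin algorithm: Langevin proposal
    plus a Metropolis-Hastings accept/reject correction."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        # Proposal: drift along the gradient of the log-density, plus noise
        mean_fwd = x + 0.5 * step**2 * grad_log_prob(x)
        prop = mean_fwd + step * rng.standard_normal(x.shape)
        # Reverse-proposal mean, needed because the proposal is asymmetric
        mean_bwd = prop + 0.5 * step**2 * grad_log_prob(prop)
        log_q_fwd = -np.sum((prop - mean_fwd) ** 2) / (2 * step**2)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * step**2)
        log_alpha = log_prob(prop) - log_prob(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples[t] = x
    return samples

# Target: standard 1D Gaussian, log pi(x) = -x^2/2 up to a constant
draws = mala(lambda x: -0.5 * np.sum(x**2), lambda x: -x,
             np.zeros(1), step=0.5, n_steps=5000)
```

A stochastic variant replaces the full-data gradient with a mini-batch estimate, which is what shifts the target to a surrogate posterior and motivates the correction the abstract describes.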

Training deep learning neural networks often requires massive amounts of computational resources. We propose to sequentially monitor network predictions to trigger retraining only if the predictions are no longer valid. This can drastically reduce computational costs and opens the door to green deep learning. Our approach is based on the relationship to projected second moments m...

In the framework of the FD (frequent directions) algorithm, we first develop two efficient algorithms for low-rank matrix approximations under the embedding matrices composed of the product of any SpEmb (sparse embedding) matrix and any standard Gaussian matrix, or any SpEmb matrix and any SRHT (subsampled randomized Hadamard transform) matrix. The theoretical results are also a...
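The plain FD (frequent directions) algorithm that this work builds on maintains a small sketch B of a tall matrix A by repeatedly shrinking all retained directions by the smallest squared singular value. A minimal sketch of basic FD (without the SpEmb or SRHT embeddings the abstract introduces), with the standard covariance-error guarantee:

```python
import numpy as np

def frequent_directions(A, ell):
    """Frequent Directions sketch: returns B (ell x d) satisfying
    ||A^T A - B^T B||_2 <= ||A||_F^2 / ell."""
    _, d = A.shape
    B = np.zeros((ell, d))
    for row in A:
        empty = np.where(~B.any(axis=1))[0]
        if empty.size == 0:
            # Sketch is full: shrink every direction by the smallest
            # squared singular value, which zeroes out the last row
            _, s, Vt = np.linalg.svd(B, full_matrices=False)
            s2 = np.maximum(s**2 - s[-1]**2, 0.0)
            B = np.sqrt(s2)[:, None] * Vt
            empty = np.where(~B.any(axis=1))[0]
        B[empty[0]] = row
    return B

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
B = frequent_directions(A, ell=10)
err = np.linalg.norm(A.T @ A - B.T @ B, 2)  # covariance sketch error
```

The embedding-based variants in the abstract first compress A with a randomized embedding before (or while) running FD, trading a little accuracy for speed.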

The task of causal representation learning aims to uncover latent higher-level causal variables that affect lower-level observations. Identifying the true latent causal variables from observed data, while allowing instantaneous causal relations among latent variables, remains a challenge, however. To this end, we start with the analysis of three intrinsic indeterminacies in iden...

Local differential privacy has become a central topic in data privacy research, offering strong privacy guarantees by perturbing user data at the source and removing the need for a trusted curator. However, the noise introduced by local differential privacy often significantly reduces data utility. To address this issue, we reinterpret private learning under local differential p...
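Randomized response is the classic mechanism behind the privacy/utility trade-off the abstract describes: each user perturbs their own bit before sending it, so no trusted curator is needed, and the aggregator debiases the noisy reports. A minimal sketch for binary data (an illustration of local DP in general, not the paper's reinterpretation):

```python
import numpy as np

def randomized_response(bit, eps, rng):
    """epsilon-LDP perturbation of one bit: report it truthfully with
    probability e^eps / (e^eps + 1), otherwise flip it."""
    p_true = np.exp(eps) / (np.exp(eps) + 1.0)
    return bit if rng.uniform() < p_true else 1 - bit

def debias(reports, eps):
    """Unbiased estimate of the true mean from the noisy reports:
    E[report] = (1 - p) + mu * (2p - 1), solved for mu."""
    p = np.exp(eps) / (np.exp(eps) + 1.0)
    return (np.mean(reports) - (1.0 - p)) / (2.0 * p - 1.0)

# Each user perturbs locally; the aggregator never sees raw bits
rng = np.random.default_rng(0)
bits = (rng.uniform(size=100_000) < 0.3).astype(int)
reports = [randomized_response(b, eps=1.0, rng=rng) for b in bits]
estimate = debias(reports, eps=1.0)
```

The noise each user adds is exactly what reduces utility: the debiased estimator's variance grows as eps shrinks, which is the loss the abstract sets out to mitigate.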
