det.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
Mastodon server of Unterhaltungsfernsehen Ehrenfeld for decentralized discourse.

Server stats: 1.8K active users

#minimization

JMLR: 'Affine Rank Minimization via Asymptotic Log-Det Iteratively Reweighted Least Squares', by Sebastian Krämer.
http://jmlr.org/papers/v26/23-0943.html
#minimizers #minimization #optimization

JMLR: 'Stabilizing Sharpness-Aware Minimization Through A Simple Renormalization Strategy', by Chengli Tan, Jiangshe Zhang, Junmin Liu, Yicheng Wang, Yunda Hao.
http://jmlr.org/papers/v26/24-0065.html
#sgd #minimization #renormalization

JMLR: 'Adjusted Expected Improvement for Cumulative Regret Minimization in Noisy Bayesian Optimization', by Shouri Hu, Haowei Wang, Zhongxiang Dai, Bryan Kian Hsiang Low, Szu Hui Ng.
http://jmlr.org/papers/v26/22-0523.html
#regret #optimization #minimization

JMLR: 'Triple Component Matrix Factorization: Untangling Global, Local, and Noisy Components', by Naichen Shi, Salar Fattahi, Raed Al Kontar.
http://jmlr.org/papers/v25/24-0400.html
#minimization #factorization #sparse

JMLR: 'Efficient Active Manifold Identification via Accelerated Iteratively Reweighted Nuclear Norm Minimization', by Hao Wang, Ye Wang, Xiangyu Yang.
http://jmlr.org/papers/v25/23-0449.html
#minimization #optimization #smoothing

JMLR: 'Stochastic Regularized Majorization-Minimization with weakly convex and multi-convex surrogates', by Hanbaek Lyu.
http://jmlr.org/papers/v25/23-0349.html
#regularization #regularized #minimization

JMLR: 'Low-Rank Matrix Estimation in the Presence of Change-Points', by Lei Shi, Guanghui Wang, Changliang Zou.
http://jmlr.org/papers/v25/22-0852.html
#matrix #trace #minimization

JMLR: 'Sharpness-Aware Minimization and the Edge of Stability', by Philip M. Long, Peter L. Bartlett.
http://jmlr.org/papers/v25/23-1285.html
#gradient #minimization #hessian

JMLR: 'Generalization and Stability of Interpolating Neural Networks with Minimal Width', by Hossein Taheri, Christos Thrampoulidis.
http://jmlr.org/papers/v25/23-0422.html
#classifiers #generalization #minimization

JMLR: 'Monotonic Risk Relationships under Distribution Shifts for Regularized Risk Minimization', by Daniel LeJeune, Jiayu Liu, Reinhard Heckel.
http://jmlr.org/papers/v25/22-1197.html
#misclassification #distributions #minimization

JMLR: 'Lower Complexity Bounds of Finite-Sum Optimization Problems: The Results and Construction', by Yuze Han, Guangzeng Xie, Zhihua Zhang.
http://jmlr.org/papers/v25/21-0264.html
#optimization #minimization #minimax
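Several of the JMLR entries here concern sharpness-aware minimization (SAM), which first perturbs the weights toward a local worst case within a small ball and then applies the gradient computed at that perturbed point. A minimal NumPy sketch of one SAM update on a toy quadratic loss — the loss, learning rate, and radius `rho` are illustrative choices, not values from any of these papers:

```python
import numpy as np

def grad(w):
    # Gradient of the toy loss L(w) = 0.5 * ||w||^2.
    return w

def sam_step(w, lr=0.1, rho=0.05):
    g = grad(w)
    # Ascent step: move to the approximate worst point in a rho-ball around w.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descent step: apply the gradient evaluated at the perturbed point.
    return w - lr * grad(w + eps)

w = np.array([1.0, -2.0])
for _ in range(100):
    w = sam_step(w)
```

Here `rho` controls the perturbation radius and thus how strongly flat minima are preferred over sharp ones; in practice SAM costs two gradient evaluations per step instead of one.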
Eric's Risk Assessment: This is an exceptional article about #COVID #risk #minimization techniques. I wish I had written it. I recommend it to anyone who takes #SARS2 #RiskManagement seriously.

tl;dr? Here is a brief summary🧵

https://howtohideapandemic.substack.com/p/how-to-hide-a-pandemic

The article summarizes the techniques of #risk #minimizers, who, while telling you to do your own #riskAssessment, are also engaged in *propaganda* (and I choose that word advisedly) to establish a different narrative with the objective of manufacturing consent for #massInfection.

You cannot #RiskManage what you do not see, think about, talk about, or measure. If you want to manage your COVID risk, you need to understand, think around & through the minimization techniques.

I wrote about this recently in my Raven Rock post, where similar techniques were used during the Cold War. As a society, we have done this before, with the risk of nuclear war. And currently, with the risk of climate change.

Thank you @anarchademic for highlighting it. I have subscribed.
JMLR: 'The Dynamics of Sharpness-Aware Minimization: Bouncing Across Ravines and Drifting Towards Wide Minima', by Peter L. Bartlett, Philip M. Long, Olivier Bousquet.
http://jmlr.org/papers/v24/23-043.html
#minimization #gradient #optimization

JMLR: 'Near-Optimal Weighted Matrix Completion', by Oscar López.
http://jmlr.org/papers/v24/22-0331.html
#matrix #matrices #minimization

JMLR: 'Sparse Plus Low Rank Matrix Decomposition: A Discrete Optimization Approach', by Dimitris Bertsimas, Ryan Cory-Wright, Nicholas A. G. Johnson.
http://jmlr.org/papers/v24/21-1130.html
#sparse #minimization #optimization

JMLR: 'Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start', by Riccardo Grazzi, Massimiliano Pontil, Saverio Salzo.
http://jmlr.org/papers/v24/22-1043.html
#minimization #optimization #optimal

JMLR: 'A Parameter-Free Conditional Gradient Method for Composite Minimization under Hölder Condition', by Masaru Ito, Zhaosong Lu, Chuan He.
http://jmlr.org/papers/v24/22-0983.html
#gradient #optimization #minimization

JMLR: 'Fast Objective & Duality Gap Convergence for Non-Convex Strongly-Concave Min-Max Problems with PL Condition', by Zhishuai Guo, Yan Yan, Zhuoning Yuan, Tianbao Yang.
http://jmlr.org/papers/v24/21-1471.html
#minimization #optimization #convex

JMLR: 'On Tilted Losses in Machine Learning: Theory and Applications', by Tian Li, Ahmad Beirami, Maziar Sanjabi, Virginia Smith.
http://jmlr.org/papers/v24/21-1095.html
#tilting #minimization #risk

Published papers at TMLR: 'Dirichlet Mechanism for Differentially Private KL Divergence Minimization', by Donlapark Ponnoprat.
https://openreview.net/forum?id=lmr2WwlaFc
#privacy #dirichlet #minimization