It is a long-standing open question whether the average-case hardness of NP implies the existence of a one-way function. The hypothetical world in which this implication fails is called Pessiland, the most pessimistic of Impagliazzo's five possible worlds. In this paper, we present the first "sharp" characterization of Pessiland:

- NP is hard on average if and only if the minimum description length of programs in agnostic learning is hard to approximate on average within an approximation factor of $\ell / \mathrm{polylog}(\ell)$, where $\ell$ is a new complexity measure of a distribution called the advice complexity of sampling.
- A one-way function does not exist if and only if the minimum description length of programs in agnostic learning is easy to approximate on average within an approximation factor of $O(\ell)$.

In particular, Pessiland is ruled out if and only if the small quantitative gap between the approximation factors $\ell/\mathrm{polylog}(\ell)$ and $O(\ell)$ is closed. Our characterization is based on an optimal NP-hardness result for the Collective Minimum Monotone Satisfying Assignment (CMMSA) problem, whose task is, given as input a collection of monotone formulas with at most $\ell$ literals, to compute the minimum weight of an assignment that satisfies as many of the monotone formulas as possible. We prove the NP-hardness of approximating this minimum weight within a factor of $\ell / \mathrm{polylog}(\ell)$, improving the previous inapproximability factor of $\ell^{\Omega(1)}$ due to Hirahara (FOCS 2022). Our inapproximability factor is optimal up to the $\mathrm{polylog}(\ell)$ factor unless NP $\subseteq$ coAM, because the CMMSA problem with an approximation factor of $O(\ell)$ is in coAM.
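To make the CMMSA objective concrete, the following is a minimal brute-force sketch (not the paper's reduction): it assumes monotone formulas are given in monotone DNF, represented as lists of terms where each term is a set of variable indices that must all be 1, and it exhaustively searches for the minimum Hamming weight among assignments satisfying the maximum possible number of formulas. All function names here are illustrative, not from the paper.

```python
from itertools import product

def satisfies(dnf, assignment):
    # Monotone DNF: the formula holds if every variable in some term is 1.
    return any(all(assignment[v] for v in term) for term in dnf)

def cmmsa(formulas, n):
    """Brute-force CMMSA over n Boolean variables: among all 0/1
    assignments, find the minimum Hamming weight of an assignment
    that satisfies the maximum achievable number of formulas.
    Returns (max formulas satisfied, min weight achieving it)."""
    best = None  # key (-num_satisfied, weight), minimized lexicographically
    for assignment in product([0, 1], repeat=n):
        sat = sum(satisfies(f, assignment) for f in formulas)
        key = (-sat, sum(assignment))
        if best is None or key < best:
            best = key
    return -best[0], best[1]

# Example: formulas x0, x1 AND x2, x0 AND x2 over 3 variables.
# Only the all-ones assignment satisfies all three, at weight 3.
print(cmmsa([[{0}], [{1, 2}], [{0, 2}]], 3))  # → (3, 3)
```

This exhaustive search runs in time $2^n$ and is only meant to pin down the optimization objective; the paper's results concern the hardness of approximating this minimum weight, not computing it exactly.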
