https://en.wikipedia.org/w/index.php?action=history&feed=atom&title=Approximation_algorithm
Approximation algorithm - Revision history
2025-05-31T11:05:46Z
Revision history for this page on the wiki
MediaWiki 1.45.0-wmf.3
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1287316159&oldid=prev
2001:638:708:306:290E:2B4C:E37E:90EA: /* Performance guarantees */ fixed a typo
2025-04-25T12:31:44Z
<p><span class="autocomment">Performance guarantees: </span> fixed a typo</p>
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 12:31, 25 April 2025</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 65:</td>
<td colspan="2" class="diff-lineno">Line 65:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Performance guarantees ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Performance guarantees ==</div></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>For some approximation algorithms it is possible to prove certain properties about the approximation of <del style="font-weight: bold; text-decoration: none;">th.e</del> optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>For some approximation algorithms it is possible to prove certain properties about the approximation of <ins style="font-weight: bold; text-decoration: none;">the</ins> optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>:<math>\begin{cases}\mathrm{OPT} \leq f(x) \leq \rho \mathrm{OPT},\qquad\mbox{if } \rho > 1; \\ \rho \mathrm{OPT} \leq f(x) \leq \mathrm{OPT},\qquad\mbox{if } \rho < 1.\end{cases}</math></div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>:<math>\begin{cases}\mathrm{OPT} \leq f(x) \leq \rho \mathrm{OPT},\qquad\mbox{if } \rho > 1; \\ \rho \mathrm{OPT} \leq f(x) \leq \mathrm{OPT},\qquad\mbox{if } \rho < 1.\end{cases}</math></div></td>
</tr>
</table>
2001:638:708:306:290E:2B4C:E37E:90EA
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1282072267&oldid=prev
Giftlite: /* Introduction */ wlnk
2025-03-24T03:45:50Z
<p><span class="autocomment">Introduction: </span> wlnk</p>
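The paragraph edited in the first diff above defines a ρ-approximation algorithm A: the cost f(x) of the approximate solution A(x) stays within a factor ρ of the optimum OPT, with one inequality for ρ > 1 and one for ρ < 1. As a minimal illustration (not part of the article or of any revision; the function name and sample numbers are made up), that guarantee can be checked directly:

```python
# Sketch only: check the rho-approximation guarantee quoted in the diff above,
#   OPT <= f(x) <= rho * OPT   if rho > 1,
#   rho * OPT <= f(x) <= OPT   if rho < 1.
def satisfies_rho_guarantee(value: float, opt: float, rho: float) -> bool:
    """Return True if `value` is within a factor `rho` of the optimum `opt`."""
    if rho > 1:
        return opt <= value <= rho * opt
    if rho < 1:
        return rho * opt <= value <= opt
    return value == opt  # rho == 1: the solution must be exactly optimal


# Example: a solution of cost 7 against an optimum of 4 respects a 2-approximation.
print(satisfies_rho_guarantee(value=7, opt=4, rho=2))  # True
```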
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 03:45, 24 March 2025</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 7:</td>
<td colspan="2" class="diff-lineno">Line 7:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Introduction ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Introduction ==</div></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>A simple example of an approximation algorithm is one for the [[Vertex cover|minimum vertex cover]] problem, where the goal is to choose the smallest set of vertices such that every edge in the input graph contains at least one chosen vertex. One way to find a vertex cover is to repeat the following process: find an uncovered edge, add both its endpoints to the cover, and remove all edges incident to either vertex from the graph. As any vertex cover of the input graph must use a distinct vertex to cover each edge that was considered in the process (since it forms a [[Matching (graph theory)|matching]]), the vertex cover produced, therefore, is at most twice as large as the optimal one. In other words, this is a [[constant-factor approximation algorithm]] with an approximation factor of 2. Under the recent [[unique games conjecture]], this factor is even the best possible one.<ref>{{Cite journal|last1=Khot|first1=Subhash|last2=Regev|first2=Oded|date=2008-05-01|title=Vertex cover might be hard to approximate to within 2−ε|journal=Journal of Computer and System Sciences|series=Computational Complexity 2003|volume=74|issue=3|pages=335–349|doi=10.1016/j.jcss.2007.06.019|doi-access=free}}</ref></div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>A simple example of an approximation algorithm is one for the [[Vertex cover|minimum vertex cover]] problem, where the goal is to choose the smallest set of vertices such that every edge in the input graph contains at least one chosen vertex. One way to find a <ins style="font-weight: bold; text-decoration: none;">[[</ins>vertex cover<ins style="font-weight: bold; text-decoration: none;">]]</ins> is to repeat the following process: find an uncovered edge, add both its endpoints to the cover, and remove all edges incident to either vertex from the graph. As any vertex cover of the input graph must use a distinct vertex to cover each edge that was considered in the process (since it forms a [[Matching (graph theory)|matching]]), the vertex cover produced, therefore, is at most twice as large as the optimal one. In other words, this is a [[constant-factor approximation algorithm]] with an approximation factor of 2. Under the recent [[unique games conjecture]], this factor is even the best possible one.<ref>{{Cite journal|last1=Khot|first1=Subhash|last2=Regev|first2=Oded|date=2008-05-01|title=Vertex cover might be hard to approximate to within 2−ε|journal=Journal of Computer and System Sciences|series=Computational Complexity 2003|volume=74|issue=3|pages=335–349|doi=10.1016/j.jcss.2007.06.019|doi-access=free}}</ref></div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>NP-hard problems vary greatly in their approximability; some, such as the [[knapsack problem]], can be approximated within a multiplicative factor <math>1 + \epsilon</math>, for any fixed <math>\epsilon > 0</math>, and therefore produce solutions arbitrarily close to the optimum (such a family of approximation algorithms is called a [[polynomial-time approximation scheme]] or PTAS). Others are impossible to approximate within any constant, or even polynomial, factor unless [[P = NP]], as in the case of the [[Clique problem|maximum clique problem]]. Therefore, an important benefit of studying approximation algorithms is a fine-grained classification of the difficulty of various NP-hard problems beyond the one afforded by the [[NP-completeness|theory of NP-completeness]]. In other words, although NP-complete problems may be equivalent (under polynomial-time reductions) to each other from the perspective of exact solutions, the corresponding optimization problems behave very differently from the perspective of approximate solutions.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>NP-hard problems vary greatly in their approximability; some, such as the [[knapsack problem]], can be approximated within a multiplicative factor <math>1 + \epsilon</math>, for any fixed <math>\epsilon > 0</math>, and therefore produce solutions arbitrarily close to the optimum (such a family of approximation algorithms is called a [[polynomial-time approximation scheme]] or PTAS). Others are impossible to approximate within any constant, or even polynomial, factor unless [[P = NP]], as in the case of the [[Clique problem|maximum clique problem]]. Therefore, an important benefit of studying approximation algorithms is a fine-grained classification of the difficulty of various NP-hard problems beyond the one afforded by the [[NP-completeness|theory of NP-completeness]]. In other words, although NP-complete problems may be equivalent (under polynomial-time reductions) to each other from the perspective of exact solutions, the corresponding optimization problems behave very differently from the perspective of approximate solutions.</div></td>
</tr>
</table>
Giftlite
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1266631785&oldid=prev
132.75.176.232: /* Hardness of approximation */ Nvm
2025-01-01T14:34:15Z
<p><span class="autocomment">Hardness of approximation: </span>Nvm</p>
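The Introduction paragraph shown in the diff above describes the textbook matching-based procedure for minimum vertex cover: repeatedly take both endpoints of an uncovered edge and discard every edge touching them. A short sketch of that procedure (my own, with a hypothetical edge-list input; not taken from the article):

```python
# Minimal sketch of the matching-based 2-approximation for minimum vertex cover
# described in the diff above: pick an uncovered edge, add both endpoints to the
# cover, and thereby discard every edge incident to either endpoint.
def approx_vertex_cover(edges):
    """Return a vertex cover at most twice the size of an optimal one."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:  # edge still uncovered
            cover.add(u)
            cover.add(v)
    return cover


# Example: a path a-b-c-d. The optimum cover {b, c} has size 2; this run returns
# all four vertices, still within the factor of 2.
print(approx_vertex_cover([("a", "b"), ("b", "c"), ("c", "d")]))
```

Because the chosen edges form a matching, any cover must spend a distinct vertex on each of them, which is exactly the argument the paragraph gives for the factor of 2.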
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 14:34, 1 January 2025</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 28:</td>
<td colspan="2" class="diff-lineno">Line 28:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Hardness of approximation ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Hardness of approximation ==</div></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.<del style="font-weight: bold; text-decoration: none;">008197</del> unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.<ins style="font-weight: bold; text-decoration: none;">008196</ins> unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411|doi-access=free}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411|doi-access=free}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
</tr>
</table>
132.75.176.232
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1266631602&oldid=prev
132.75.176.232: /* Hardness of approximation */ 123/122=1.008196721311≈1.008197≈×1.008196
2025-01-01T14:32:59Z
<p><span class="autocomment">Hardness of approximation: </span>123/122=1.008196721311≈1.008197≈×1.008196</p>
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 14:32, 1 January 2025</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 28:</td>
<td colspan="2" class="diff-lineno">Line 28:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Hardness of approximation ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Hardness of approximation ==</div></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.<del style="font-weight: bold; text-decoration: none;">008196</del> unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.<ins style="font-weight: bold; text-decoration: none;">008197</ins> unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411|doi-access=free}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411|doi-access=free}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
</tr>
</table>
132.75.176.232
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1229752640&oldid=prev
5.102.3.156: /* Performingnce guarantees */
2024-06-18T15:02:43Z
<p><span class="autocomment">Performingnce guarantees</span></p>
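The two revisions above flip the last digit of the quoted inapproximability constant. A quick check (mine, not from the article) of the value 123/122 named in both edit summaries:

```python
# Arithmetic check of the constant the two diffs above disagree about.
ratio = 123 / 122
print(f"{ratio:.12f}")   # 1.008196721311
print(round(ratio, 6))   # 1.008197
```

So 1.008197 is the value rounded to six decimals, while 1.008196 is the truncation.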
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 15:02, 18 June 2024</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 64:</td>
<td colspan="2" class="diff-lineno">Line 64:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by demonstrating that there exist instances where the algorithm performs at the approximation limit, indicating the tightness of the bound. In this case, it's enough to construct an input instance designed to force the algorithm into a worst-case scenario.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by demonstrating that there exist instances where the algorithm performs at the approximation limit, indicating the tightness of the bound. In this case, it's enough to construct an input instance designed to force the algorithm into a worst-case scenario.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>== <del style="font-weight: bold; text-decoration: none;">Performingnce</del> guarantees ==</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>== <ins style="font-weight: bold; text-decoration: none;">Performance</ins> guarantees ==</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>For some approximation algorithms it is possible to prove certain properties about the approximation of th.e optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>For some approximation algorithms it is possible to prove certain properties about the approximation of th.e optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
</table>
5.102.3.156
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1229752597&oldid=prev
5.102.3.156: /* Structure of approximation algorithms */
2024-06-18T15:02:27Z
<p><span class="autocomment">Structure of approximation algorithms</span></p>
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 15:02, 18 June 2024</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 59:</td>
<td colspan="2" class="diff-lineno">Line 59:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>Specifically, having <math>A_{\Pi}(i) \in S_i</math>, the algorithm has an '''approximation factor''' (or '''approximation ratio''') of <math>\rho(n)</math> if <math>\forall i \in I \ s.t. |i| = n</math>, we have:</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>Specifically, having <math>A_{\Pi}(i) \in S_i</math>, the algorithm has an '''approximation factor''' (or '''approximation ratio''') of <math>\rho(n)</math> if <math>\forall i \in I \ s.t. |i| = n</math>, we have:</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>* for a ''minimization'' problem: <math>\frac{c(A_{\Pi}(i)}{c(s^*(i))}\leq\rho(n)</math>, which in turn means the solution taken by the algorithm divided by the optimal solution achieves a ratio of <math>\rho(n)</math>;</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>* for a ''minimization'' problem: <math>\frac{c(A_{\Pi}(i<ins style="font-weight: bold; text-decoration: none;">)</ins>)}{c(s^*(i))}\leq\rho(n)</math>, which in turn means the solution taken by the algorithm divided by the optimal solution achieves a ratio of <math>\rho(n)</math>;</div></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>* for a ''maximization'' problem: <math>\frac{c(s^*(i))}{c(A_{\Pi}(i)}\leq\rho(n)</math>, which in turn means the optimal solution divided by the solution taken by the algorithm achieves a ratio of <math>\rho(n)</math>;</div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>* for a ''maximization'' problem: <math>\frac{c(s^*(i))}{c(A_{\Pi}(i<ins style="font-weight: bold; text-decoration: none;">)</ins>)}\leq\rho(n)</math>, which in turn means the optimal solution divided by the solution taken by the algorithm achieves a ratio of <math>\rho(n)</math>;</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by demonstrating that there exist instances where the algorithm performs at the approximation limit, indicating the tightness of the bound. In this case, it's enough to construct an input instance designed to force the algorithm into a worst-case scenario.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by demonstrating that there exist instances where the algorithm performs at the approximation limit, indicating the tightness of the bound. In this case, it's enough to construct an input instance designed to force the algorithm into a worst-case scenario.</div></td>
</tr>
</table>
5.102.3.156
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1229752476&oldid=prev
5.102.3.156: /* Structure of approximation algorithms */
2024-06-18T15:01:44Z
<p><span class="autocomment">Structure of approximation algorithms</span></p>
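The diff above repairs the two ratio formulas, c(A_Π(i))/c(s*(i)) ≤ ρ(n) for minimization and c(s*(i))/c(A_Π(i)) ≤ ρ(n) for maximization. A small helper (illustrative only; the names are mine, not the article's) that computes the ratio in the orientation that keeps it at least 1:

```python
# Sketch of the approximation ratio as written in the diff above:
# algorithm cost over optimum for minimization, optimum over algorithm cost
# for maximization, so the ratio is always >= 1.
def approximation_ratio(algorithm_cost: float, optimal_cost: float, minimization: bool) -> float:
    """c(A(i))/c(s*(i)) for minimization, c(s*(i))/c(A(i)) for maximization."""
    if minimization:
        return algorithm_cost / optimal_cost
    return optimal_cost / algorithm_cost


print(approximation_ratio(7, 4, minimization=True))     # 1.75, within rho(n) = 2
print(approximation_ratio(30, 50, minimization=False))  # ~1.67 for a maximization instance
```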
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 15:01, 18 June 2024</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 62:</td>
<td colspan="2" class="diff-lineno">Line 62:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>* for a ''maximization'' problem: <math>\frac{c(s^*(i))}{c(A_{\Pi}(i)}\leq\rho(n)</math>, which in turn means the optimal solution divided by the solution taken by the algorithm achieves a ratio of <math>\rho(n)</math>;</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>* for a ''maximization'' problem: <math>\frac{c(s^*(i))}{c(A_{\Pi}(i)}\leq\rho(n)</math>, which in turn means the optimal solution divided by the solution taken by the algorithm achieves a ratio of <math>\rho(n)</math>;</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by <del style="font-weight: bold; text-decoration: none;">d</del></div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by <ins style="font-weight: bold; text-decoration: none;">demonstrating that there exist instances where the algorithm performs at the approximation limit, indicating the tightness of the bound. In this case, it's enough to construct an input instance designed to force the algorithm into a worst-case scenario.</ins></div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Performingnce guarantees ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Performingnce guarantees ==</div></td>
</tr>
</table>
5.102.3.156
https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1229752355&oldid=prev
5.102.3.156: /* Performance guarantees */
2024-06-18T15:01:02Z
<p><span class="autocomment">Performance guarantees</span></p>
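The sentence completed in the tight-approximation diff above calls for an input instance that forces an algorithm into its worst case. One such construction (my own, reusing the matching-based vertex cover procedure from the Introduction diff): a graph made of k disjoint edges makes the procedure take all 2k endpoints, while k vertices already form a cover, so the factor of 2 is attained exactly.

```python
# Worst-case instance sketch: on k disjoint edges the matching-based procedure
# picks both endpoints of every edge (2k vertices), while one endpoint per edge
# (k vertices) is an optimal cover, so the factor-2 analysis is tight.
def greedy_cover_size(edges):
    """Size of the cover produced by the matching-based procedure."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return len(cover)


k = 5
matching = [(2 * i, 2 * i + 1) for i in range(k)]  # k disjoint edges; an optimal cover has k vertices
print(greedy_cover_size(matching) / k)  # 2.0: the bound is attained
```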
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 15:01, 18 June 2024</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 38:</td>
<td colspan="2" class="diff-lineno">Line 38:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>In other cases, even if the initial results are of purely theoretical interest, over time, with an improved understanding, the algorithms may be refined to become more practical. One such example is the initial PTAS for [[Euclidean traveling salesman problem|Euclidean TSP]] by [[Sanjeev Arora]] (and independently by [[Joseph S. B. Mitchell|Joseph Mitchell]]) which had a prohibitive running time of <math>n^{O(1/\epsilon)}</math> for a <math>1+\epsilon</math> approximation.<ref>{{Cite book|last=Arora|first=S.|title=Proceedings of 37th Conference on Foundations of Computer Science |chapter=Polynomial time approximation schemes for Euclidean TSP and other geometric problems |date=October 1996|pages=2–11|doi=10.1109/SFCS.1996.548458|isbn=978-0-8186-7594-2|citeseerx=10.1.1.32.3376|s2cid=1499391}}</ref> Yet, within a year these ideas were incorporated into a near-linear time <math>O(n\log n)</math> algorithm for any constant <math>\epsilon > 0</math>.<ref>{{Cite book|last=Arora|first=S.|title=Proceedings 38th Annual Symposium on Foundations of Computer Science |date=October 1997|chapter=Nearly linear time approximation schemes for Euclidean TSP and other geometric problems|pages=554–563|doi=10.1109/SFCS.1997.646145|isbn=978-0-8186-8197-4|s2cid=10656723}}</ref></div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>In other cases, even if the initial results are of purely theoretical interest, over time, with an improved understanding, the algorithms may be refined to become more practical. One such example is the initial PTAS for [[Euclidean traveling salesman problem|Euclidean TSP]] by [[Sanjeev Arora]] (and independently by [[Joseph S. B. Mitchell|Joseph Mitchell]]) which had a prohibitive running time of <math>n^{O(1/\epsilon)}</math> for a <math>1+\epsilon</math> approximation.<ref>{{Cite book|last=Arora|first=S.|title=Proceedings of 37th Conference on Foundations of Computer Science |chapter=Polynomial time approximation schemes for Euclidean TSP and other geometric problems |date=October 1996|pages=2–11|doi=10.1109/SFCS.1996.548458|isbn=978-0-8186-7594-2|citeseerx=10.1.1.32.3376|s2cid=1499391}}</ref> Yet, within a year these ideas were incorporated into a near-linear time <math>O(n\log n)</math> algorithm for any constant <math>\epsilon > 0</math>.<ref>{{Cite book|last=Arora|first=S.|title=Proceedings 38th Annual Symposium on Foundations of Computer Science |date=October 1997|chapter=Nearly linear time approximation schemes for Euclidean TSP and other geometric problems|pages=554–563|doi=10.1109/SFCS.1997.646145|isbn=978-0-8186-8197-4|s2cid=10656723}}</ref></div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>== Structure of approximation algorithms ==</div></td>
</tr>
<tr>
<td class="diff-marker"><a class="mw-diff-movedpara-left" title="Paragraph was moved. Click to jump to new location." href="#movedpara_5_24_rhs">⚫</a></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div><a name="movedpara_2_0_lhs"></a>== <del style="font-weight: bold; text-decoration: none;">Performance</del> guarantees ==</div></td>
<td colspan="2" class="diff-empty diff-side-added"></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>Given an optimization problem:</div></td>
</tr>
<tr>
<td class="diff-marker"><a class="mw-diff-movedpara-left" title="Paragraph was moved. Click to jump to new location." href="#movedpara_5_25_rhs">⚫</a></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div><a name="movedpara_4_0_lhs"></a>For some approximation algorithms it is possible to prove certain properties about the approximation of <del style="font-weight: bold; text-decoration: none;">the</del> optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
<td colspan="2" class="diff-empty diff-side-added"></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><math>\Pi: I \times S</math></div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>where <math>\Pi</math> is an approximation problem, <math>I</math> the set of inputs and <math>S</math> the set of solutions, we can define the cost function:</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><math>c: S \rightarrow \mathbb{R}^+</math></div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>and the set of feasible solutions:</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><math>\forall i \in I, S(i) = {s \in S: i\Pi_s}</math></div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>finding the best solution <math>s^*</math> for a maximization or a minimization problem:</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><math>s^* \in S(i)</math>, <math>c(s^*) = min/max \ c(S(i))</math></div></td>
</tr>
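<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>For example, in the minimum [[vertex cover]] problem an input <math>i</math> is a graph, <math>S(i)</math> is the collection of vertex covers of <math>i</math>, the cost <math>c(s)</math> is the number of vertices in the cover <math>s</math>, and <math>s^*</math> is a cover of minimum cardinality.</div></td>
</tr>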
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>Given a feasible solution <math>s \in S(i)</math>, with <math>s \neq s^*</math>, we would want a guarantee of the quality of the solution, which is a performance to be guaranteed (approximation factor).</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>Specifically, having <math>A_{\Pi}(i) \in S_i</math>, the algorithm has an '''approximation factor''' (or '''approximation ratio''') of <math>\rho(n)</math> if <math>\forall i \in I \ s.t. |i| = n</math>, we have:</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>* for a ''minimization'' problem: <math>\frac{c(A_{\Pi}(i)}{c(s^*(i))}\leq\rho(n)</math>, which in turn means the solution taken by the algorithm divided by the optimal solution achieves a ratio of <math>\rho(n)</math>;</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>* for a ''maximization'' problem: <math>\frac{c(s^*(i))}{c(A_{\Pi}(i)}\leq\rho(n)</math>, which in turn means the optimal solution divided by the solution taken by the algorithm achieves a ratio of <math>\rho(n)</math>;</div></td>
</tr>
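<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>As a minimal illustrative sketch (not taken from the cited sources), the snippet below measures the minimization ratio <math>c(A_{\Pi}(i))/c(s^*(i))</math> on one small instance, using the classic maximal-matching heuristic for minimum vertex cover, which is known to be a 2-approximation; the function names and the example graph are hypothetical.</div>
<div><syntaxhighlight lang="python">
from itertools import combinations

def approx_vertex_cover(edges):
    """Maximal-matching heuristic: add both endpoints of any uncovered edge (a 2-approximation)."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

def optimal_vertex_cover(vertices, edges):
    """Brute-force optimum c(s*); only feasible for very small graphs."""
    for k in range(len(vertices) + 1):
        for subset in combinations(sorted(vertices), k):
            s = set(subset)
            if all(u in s or v in s for u, v in edges):
                return s
    return set(vertices)

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
vertices = {v for e in edges for v in e}
ratio = len(approx_vertex_cover(edges)) / len(optimal_vertex_cover(vertices, edges))
print(ratio)  # never exceeds 2 for this heuristic; on this instance it is exactly 2.0
</syntaxhighlight></div></td>
</tr>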
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>The approximation can be proven ''tight'' (''tight approximation'') by d</div></td>
</tr>
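<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>For example, for the maximal-matching heuristic for vertex cover sketched above, a graph consisting of <math>n</math> disjoint edges is such an instance: the heuristic returns all <math>2n</math> endpoints while an optimal cover contains only <math>n</math> vertices, so the factor 2 is attained exactly.</div></td>
</tr>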
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker"><a class="mw-diff-movedpara-right" title="Paragraph was moved. Click to jump to old location." href="#movedpara_2_0_lhs">⚫</a></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><a name="movedpara_5_24_rhs"></a>== <ins style="font-weight: bold; text-decoration: none;">Performingnce</ins> guarantees ==</div></td>
</tr>
<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker"><a class="mw-diff-movedpara-right" title="Paragraph was moved. Click to jump to old location." href="#movedpara_4_0_lhs">⚫</a></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div><a name="movedpara_5_25_rhs"></a>For some approximation algorithms it is possible to prove certain properties about the approximation of <ins style="font-weight: bold; text-decoration: none;">th.e</ins> optimum result. For example, a '''''ρ''-approximation algorithm''' ''A'' is defined to be an algorithm for which it has been proven that the value/cost, ''f''(''x''), of the approximate solution ''A''(''x'') to an instance ''x'' will not be more (or less, depending on the situation) than a factor ''ρ'' times the value, OPT, of an optimum solution.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>:<math>\begin{cases}\mathrm{OPT} \leq f(x) \leq \rho \mathrm{OPT},\qquad\mbox{if } \rho > 1; \\ \rho \mathrm{OPT} \leq f(x) \leq \mathrm{OPT},\qquad\mbox{if } \rho < 1.\end{cases}</math></div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>:<math>\begin{cases}\mathrm{OPT} \leq f(x) \leq \rho \mathrm{OPT},\qquad\mbox{if } \rho > 1; \\ \rho \mathrm{OPT} \leq f(x) \leq \mathrm{OPT},\qquad\mbox{if } \rho < 1.\end{cases}</math></div></td>
</tr>
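<tr>
<td colspan="2" class="diff-empty diff-side-deleted"></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>For instance (an illustrative reading of the guarantee above), a 2-approximation algorithm for a minimization problem applied to an instance with <math>\mathrm{OPT} = 10</math> returns a solution whose cost satisfies <math>10 \leq f(x) \leq 20</math>; for a maximization problem one would instead have, say, <math>\rho = 1/2</math>, giving the guarantee <math>5 \leq f(x) \leq 10</math>.</div></td>
</tr>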
</table>5.102.3.156https://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1170144023&oldid=prevOAbot: Open access bot: doi added to citation with #oabot.2023-08-13T11:07:37Z<p><a href="/wiki/Wikipedia:OABOT" class="mw-redirect" title="Wikipedia:OABOT">Open access bot</a>: doi added to citation with #oabot.</p>
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 11:07, 13 August 2023</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 30:</td>
<td colspan="2" class="diff-lineno">Line 30:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.008196 unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>Approximation algorithms as a research area is closely related to and informed by [[Hardness of approximation|inapproximability theory]] where the non-existence of efficient algorithms with certain approximation ratios is proved (conditioned on widely believed hypotheses such as the P ≠ NP conjecture) by means of [[Reduction (complexity)|reductions]]. In the case of the metric traveling salesman problem, the best known inapproximability result rules out algorithms with an approximation ratio less than 123/122 ≈ 1.008196 unless P = NP, Karpinski, Lampis, Schmied.<ref>{{Cite journal|last1=Karpinski|first1=Marek|last2=Lampis|first2=Michael|last3=Schmied|first3=Richard|date=2015-12-01|title=New inapproximability bounds for TSP|journal=Journal of Computer and System Sciences|volume=81|issue=8|pages=1665–1677|doi=10.1016/j.jcss.2015.06.003|arxiv=1303.6437}}</ref> Coupled with the knowledge of the existence of Christofides' 1.5 approximation algorithm, this tells us that the threshold of approximability for metric traveling salesman (if it exists) is somewhere between 123/122 and 1.5.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>While inapproximability results have been proved since the 1970s, such results were obtained by ad hoc means and no systematic understanding was available at the time. It is only since the 1990 result of Feige, Goldwasser, Lovász, Safra and Szegedy on the inapproximability of [[Independent set (graph theory)|Independent Set]]<ref>{{Cite journal|last1=Feige|first1=Uriel|last2=Goldwasser|first2=Shafi|last3=Lovász|first3=Laszlo|last4=Safra|first4=Shmuel|last5=Szegedy|first5=Mario|date=March 1996|title=Interactive Proofs and the Hardness of Approximating Cliques|journal=J. ACM|volume=43|issue=2|pages=268–292|doi=10.1145/226643.226652|issn=0004-5411|doi-access=free}}</ref> and the famous [[PCP theorem]],<ref>{{Cite journal|last1=Arora|first1=Sanjeev|last2=Safra|first2=Shmuel|date=January 1998|title=Probabilistic Checking of Proofs: A New Characterization of NP|journal=J. ACM|volume=45|issue=1|pages=70–122|doi=10.1145/273865.273901|s2cid=751563|issn=0004-5411<ins style="font-weight: bold; text-decoration: none;">|doi-access=free</ins>}}</ref> that modern tools for proving inapproximability results were uncovered. The PCP theorem, for example, shows that [[David S. Johnson|Johnson's]] 1974 approximation algorithms for [[Maximum satisfiability problem|Max SAT]], [[Set cover problem|set cover]], [[Independent set (graph theory)|independent set]] and [[Graph coloring|coloring]] all achieve the optimal approximation ratio, assuming P ≠ NP.<ref>{{Cite journal|last=Johnson|first=David S.|date=1974-12-01|title=Approximation algorithms for combinatorial problems|journal=Journal of Computer and System Sciences|volume=9|issue=3|pages=256–278|doi=10.1016/S0022-0000(74)80044-9|doi-access=free}}</ref></div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Practicality ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Practicality ==</div></td>
</tr>
</table>OAbothttps://en.wikipedia.org/w/index.php?title=Approximation_algorithm&diff=1168968958&oldid=prevCitation bot: Alter: title, template type. Add: chapter-url, chapter. Removed or converted URL. Removed parameters. Some additions/deletions were parameter name changes. | Use this bot. Report bugs. | Suggested by Headbomb | Linked from Wikipedia:WikiProject_Academic_Journals/Journals_cited_by_Wikipedia/Sandbox3 | #UCB_webform_linked 134/23062023-08-06T05:58:53Z<p>Alter: title, template type. Add: chapter-url, chapter. Removed or converted URL. Removed parameters. Some additions/deletions were parameter name changes. | <a href="/wiki/Wikipedia:UCB" class="mw-redirect" title="Wikipedia:UCB">Use this bot</a>. <a href="/wiki/Wikipedia:DBUG" class="mw-redirect" title="Wikipedia:DBUG">Report bugs</a>. | Suggested by Headbomb | Linked from Wikipedia:WikiProject_Academic_Journals/Journals_cited_by_Wikipedia/Sandbox3 | #UCB_webform_linked 134/2306</p>
<table style="background-color: #fff; color: #202122;" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">← Previous revision</td>
<td colspan="2" style="background-color: #fff; color: #202122; text-align: center;">Revision as of 05:58, 6 August 2023</td>
</tr><tr>
<td colspan="2" class="diff-lineno">Line 4:</td>
<td colspan="2" class="diff-lineno">Line 4:</td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The design and [[mathematical analysis|analysis]] of approximation algorithms crucially involves a [[mathematical proof]] certifying the quality of the returned solutions in the worst case.<ref name="Bernard. 2011"/> This distinguishes them from [[heuristic (computer science)|heuristics]] such as [[Simulated annealing|annealing]] or [[genetic algorithm]]s, which find reasonably good solutions on some inputs, but provide no clear indication at the outset on when they may succeed or fail.</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>The design and [[mathematical analysis|analysis]] of approximation algorithms crucially involves a [[mathematical proof]] certifying the quality of the returned solutions in the worst case.<ref name="Bernard. 2011"/> This distinguishes them from [[heuristic (computer science)|heuristics]] such as [[Simulated annealing|annealing]] or [[genetic algorithm]]s, which find reasonably good solutions on some inputs, but provide no clear indication at the outset on when they may succeed or fail.</div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker" data-marker="−"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>There is widespread interest in [[theoretical computer science]] to better understand the limits to which we can approximate certain famous optimization problems. For example, one of the long-standing open questions in computer science is to determine whether there is an algorithm that outperforms the 2-approximation for the Steiner Forest problem by Agrawal et al.<ref>{{Cite <del style="font-weight: bold; text-decoration: none;">journal</del> |last1=Agrawal |first1=Ajit |last2=Klein |first2=Philip |last3=Ravi |first3=R.<del style="font-weight: bold; text-decoration: none;"> |date=1991</del> |title=<del style="font-weight: bold; text-decoration: none;">When</del> <del style="font-weight: bold; text-decoration: none;">trees</del> <del style="font-weight: bold; text-decoration: none;">collide:</del> <del style="font-weight: bold; text-decoration: none;">an</del> <del style="font-weight: bold; text-decoration: none;">approximation</del> <del style="font-weight: bold; text-decoration: none;">algorithm</del> <del style="font-weight: bold; text-decoration: none;">for</del> <del style="font-weight: bold; text-decoration: none;">the</del> <del style="font-weight: bold; text-decoration: none;">generalized</del> <del style="font-weight: bold; text-decoration: none;">Steiner</del> <del style="font-weight: bold; text-decoration: none;">problem</del> <del style="font-weight: bold; text-decoration: none;">on</del> <del style="font-weight: bold; text-decoration: none;">networks</del> |url=http://portal.acm.org/citation.cfm?doid=103418.103437<del style="font-weight: bold; text-decoration: none;"> |journal=Proceedings of the Twenty-third Annual ACM Symposium on Theory of Computing - STOC '91</del> |language=en |location=New Orleans, Louisiana, United States |publisher=ACM Press |pages=134–144 |doi=10.1145/103418.103437 |isbn=978-0-89791-397-3|s2cid=1245448 }}</ref> The desire to understand hard optimization problems from the perspective of approximability is motivated by the discovery of surprising mathematical connections and broadly applicable techniques to design algorithms for hard optimization problems. One well-known example of the former is the [[Semidefinite programming#Example 3 (Goemans–Williamson max cut approximation algorithm)|Goemans–Williamson algorithm]] for [[maximum cut]], which solves a graph theoretic problem using high dimensional geometry.<ref>{{Cite journal|last1=Goemans|first1=Michel X.|last2=Williamson|first2=David P.|date=November 1995|title=Improved Approximation Algorithms for Maximum Cut and Satisfiability Problems Using Semidefinite Programming|journal=J. ACM|volume=42|issue=6|pages=1115–1145|doi=10.1145/227683.227684|issn=0004-5411|citeseerx=10.1.1.34.8500|s2cid=15794408}}</ref></div></td>
<td class="diff-marker" data-marker="+"></td>
<td style="color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>There is widespread interest in [[theoretical computer science]] to better understand the limits to which we can approximate certain famous optimization problems. For example, one of the long-standing open questions in computer science is to determine whether there is an algorithm that outperforms the 2-approximation for the Steiner Forest problem by Agrawal et al.<ref>{{Cite <ins style="font-weight: bold; text-decoration: none;">book</ins> |last1=Agrawal |first1=Ajit |last2=Klein |first2=Philip |last3=Ravi |first3=R. |title=<ins style="font-weight: bold; text-decoration: none;">Proceedings</ins> <ins style="font-weight: bold; text-decoration: none;">of</ins> <ins style="font-weight: bold; text-decoration: none;">the</ins> <ins style="font-weight: bold; text-decoration: none;">twenty-third</ins> <ins style="font-weight: bold; text-decoration: none;">annual</ins> <ins style="font-weight: bold; text-decoration: none;">ACM</ins> <ins style="font-weight: bold; text-decoration: none;">symposium</ins> <ins style="font-weight: bold; text-decoration: none;">on</ins> <ins style="font-weight: bold; text-decoration: none;">Theory</ins> <ins style="font-weight: bold; text-decoration: none;">of</ins> <ins style="font-weight: bold; text-decoration: none;">computing</ins> <ins style="font-weight: bold; text-decoration: none;">-</ins> <ins style="font-weight: bold; text-decoration: none;">STOC '91</ins> |<ins style="font-weight: bold; text-decoration: none;">chapter=When trees collide |date=1991 |chapter-</ins>url=http://portal.acm.org/citation.cfm?doid=103418.103437 |language=en |location=New Orleans, Louisiana, United States |publisher=ACM Press |pages=134–144 |doi=10.1145/103418.103437 |isbn=978-0-89791-397-3|s2cid=1245448 }}</ref> The desire to understand hard optimization problems from the perspective of approximability is motivated by the discovery of surprising mathematical connections and broadly applicable techniques to design algorithms for hard optimization problems. One well-known example of the former is the [[Semidefinite programming#Example 3 (Goemans–Williamson max cut approximation algorithm)|Goemans–Williamson algorithm]] for [[maximum cut]], which solves a graph theoretic problem using high dimensional geometry.<ref>{{Cite journal|last1=Goemans|first1=Michel X.|last2=Williamson|first2=David P.|date=November 1995|title=Improved Approximation Algorithms for Maximum Cut and Satisfiability Problems Using Semidefinite Programming|journal=J. ACM|volume=42|issue=6|pages=1115–1145|doi=10.1145/227683.227684|issn=0004-5411|citeseerx=10.1.1.34.8500|s2cid=15794408}}</ref></div></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><br /></td>
</tr>
<tr>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Introduction ==</div></td>
<td class="diff-marker"></td>
<td style="background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>== Introduction ==</div></td>
</tr>
</table>Citation bot