Probabilistic analysis of algorithms: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 12:15, 16 December 2019

In analysis of algorithms, probabilistic analysis of algorithms is an approach to estimate the computational complexity of an algorithm or a computational problem. It starts from an assumption about the probability distribution of the set of all possible inputs. This assumption is then used to design an efficient algorithm or to derive the complexity of a known algorithm.

This approach is not the same as that of probabilistic algorithms, but the two may be combined.

For non-probabilistic, more specifically deterministic, algorithms, the most common types of complexity estimates are the average-case complexity (expected-time complexity) and the almost-always complexity. To obtain the average-case complexity, given an input distribution, the expected running time of the algorithm is evaluated, whereas for the almost-always complexity estimate, one shows that the algorithm admits a given complexity bound that almost surely holds.
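As a minimal illustration (not part of the article), the average-case cost of linear search under a uniform input distribution can be computed directly: if the key is equally likely to be at any of the n positions, the expected number of comparisons is (n + 1)/2.

```python
def linear_search_comparisons(arr, key):
    """Return the number of comparisons linear search makes to find key."""
    for i, x in enumerate(arr):
        if x == key:
            return i + 1  # compared against positions 0..i
    return len(arr)  # unsuccessful search compares every element

# Average the cost over a uniform distribution of the key's position:
n = 100
arr = list(range(n))
average = sum(linear_search_comparisons(arr, k) for k in arr) / n
print(average)  # (n + 1) / 2 = 50.5
```

Here the "input distribution" is simply uniform over the n possible key positions, so the expectation reduces to an exact average over all cases.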

In probabilistic analysis of probabilistic (randomized) algorithms, the distributions of, or averaging over, all possible choices in randomized steps are also taken into account, in addition to the input distributions.
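As a concrete sketch of averaging over an algorithm's own random choices (an illustrative example, not drawn from the article): in randomized quicksort the pivot is chosen uniformly at random at every step, and the expected number of comparisons E[C_n] on n distinct keys satisfies the recurrence E[C_n] = (n − 1) + (2/n) · Σ_{i<n} E[C_i], which can be evaluated exactly.

```python
from fractions import Fraction

def expected_quicksort_comparisons(n):
    """Expected comparisons of randomized quicksort on n distinct keys,
    averaging over the uniform random pivot choice at every step:
    E[C_n] = (n - 1) + (2/n) * sum_{i=0}^{n-1} E[C_i]."""
    e = [Fraction(0)] * (n + 1)  # E[C_0] = E[C_1] = 0
    for m in range(2, n + 1):
        # m - 1 comparisons against the pivot, plus the expected cost of
        # the two subproblems, averaged over the m equally likely pivots.
        e[m] = (m - 1) + Fraction(2, m) * sum(e[:m])
    return e[n]

print(expected_quicksort_comparisons(3))  # 8/3
```

Using exact rationals makes the averaging over random pivot choices explicit; the same quantity could also be estimated empirically by running the randomized algorithm many times and averaging the observed comparison counts.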

See also