power law distribution
A statistical relationship where a small number of items account for the majority of the outcomes. In venture capital, a few highly successful investments (winners) generate the vast majority of a fund's returns.
entitydetail.created_at
8/16/2025, 2:37:27 AM
entitydetail.last_updated
8/16/2025, 2:39:43 AM
entitydetail.research_retrieved
8/16/2025, 2:39:43 AM
Summary
A power law distribution describes a fundamental functional relationship where a relative change in one quantity leads to a proportional relative change in another quantity, raised to a constant exponent, independent of their initial sizes. This means one quantity varies as a power of another. A simple illustration is the relationship between a square's area and its side length, where doubling the side length quadruples the area (2²). In the context of venture capital, understanding and identifying "power law winners"—companies such as Palantir, Uber, Airbnb, Spotify, and Figma that generate disproportionately large returns—is considered a core strategy for success, as highlighted in discussions about the VC model.
Referenced in 1 Document
Research Data
Extracted Attributes
Field
Statistics, Mathematics, Data Science
Definition
A functional relationship where a relative change in one quantity results in a proportional relative change in another quantity, raised to a constant exponent, meaning one quantity varies as a power of another.
Plotting Method
Often plotted on doubly logarithmic axes to emphasize the upper tail region.
Alternative Names
Scaling law, Scale-free distribution
Key Characteristic
The change is independent of the initial size of the quantities.
Common Applications
Venture Capital (identifying 'power law winners'), wealth distribution, earthquake magnitudes (Gutenberg-Richter law), website traffic, book sales, city sizes, word frequencies (Zipf's law)
Mathematical Form (Continuous)
p(x) = Cx⁻ᵅ, where C is a constant and α > 0.
Moment Behavior (for 2 < α < 3)
The mean exists, but the variance and higher-order moments are infinite, implying central moment estimators may not converge with more data.
Mathematical Form (Probability Distribution)
P(X>x) ~ L(x)x⁻⁽ᵅ⁻¹⁾ for large x, where L(x) is a slowly varying function.
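The moment behavior noted above can be illustrated by simulation. A minimal sketch (NumPy assumed; `sample_power_law` is an illustrative helper using inverse-transform sampling, not a standard library function):

```python
import numpy as np

def sample_power_law(n, alpha=2.5, x_min=1.0, rng=None):
    """Inverse-transform sampling from p(x) = C * x**-alpha for x >= x_min,
    whose survival function is P(X > x) = (x / x_min)**-(alpha - 1)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    u = rng.random(n)
    return x_min * u ** (-1.0 / (alpha - 1.0))

rng = np.random.default_rng(42)
small = sample_power_law(1_000, rng=rng)
large = sample_power_law(1_000_000, rng=rng)

# For alpha = 2.5 the mean exists: it converges toward
# x_min * (alpha - 1) / (alpha - 2) = 3.0 as n grows.
print("means:", small.mean(), large.mean())
# The variance is infinite, so the sample variance never settles:
# it tends to keep growing as more data is accumulated.
print("variances:", small.var(), large.var())
```

Running this repeatedly with different seeds shows the sample mean stabilizing while the sample variance fluctuates wildly, as the attribute describes.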
Wikipedia
Power law
In statistics, a power law is a functional relationship between two quantities, where a relative change in one quantity results in a relative change in the other quantity proportional to the change raised to a constant exponent: one quantity varies as a power of another. The change is independent of the initial size of those quantities. For instance, the area of a square has a power law relationship with the length of its side, since if the length is doubled, the area is multiplied by 2², while if the length is tripled, the area is multiplied by 3², and so on.
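The square example above can be checked directly. A minimal sketch in plain Python (names are illustrative):

```python
import math

def area(side):
    """Area varies as the 2nd power of the side length: a power law."""
    return side ** 2

# The relative change is independent of the starting size:
# doubling the side always multiplies the area by 2**2 = 4,
# tripling always multiplies it by 3**2 = 9.
for s in (1.0, 2.5, 10.0):
    assert math.isclose(area(2 * s) / area(s), 4.0)
    assert math.isclose(area(3 * s) / area(s), 9.0)
```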
Web Search Results
- Power law - Wikipedia
f(x) ∝ x⁻ᵅe⁻ᵝˣ (power law with exponential cutoff); f(x) ∝ x^(α+βx) (curved power law). Power-law probability distributions: in a looser sense, a power-law probability distribution is a distribution whose density function (or mass function in the discrete case) has the form, for large values of x, P(X > x) ~ L(x)x⁻⁽ᵅ⁻¹⁾, where [...] for 2 < α < 3, the mean exists, but the variance and higher-order moments are infinite, etc. For finite-size samples drawn from such a distribution, this behavior implies that the central moment estimators (like the mean and the variance) for diverging moments will never converge – as more data is accumulated, they continue to grow. These power-law probability distributions are also called Pareto-type distributions, distributions with Pareto tails, or distributions with regularly [...] In general, power-law distributions are plotted on doubly logarithmic axes, which emphasizes the upper tail region. The most convenient way to do this is via the (complementary) cumulative distribution function (ccdf), that is, the survival function P(x) = Pr(X > x).
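The ccdf plotting idea from the snippet above can be sketched numerically: on doubly logarithmic axes the survival function of a pure power law is a straight line with slope −(α − 1), so a naive least-squares fit roughly recovers the exponent (a sketch only; maximum-likelihood estimation is generally preferred for real data):

```python
import numpy as np

def empirical_ccdf(samples):
    """Survival function P(X > x) evaluated at the sorted sample points."""
    xs = np.sort(samples)
    # fraction of points strictly above each sorted value
    ccdf = 1.0 - np.arange(1, len(xs) + 1) / len(xs)
    return xs[:-1], ccdf[:-1]  # drop the last point, where ccdf == 0

# Exact power-law data: P(X > x) = x**-(alpha - 1) with alpha = 2.5
rng = np.random.default_rng(7)
alpha = 2.5
samples = rng.random(50_000) ** (-1.0 / (alpha - 1.0))

xs, ccdf = empirical_ccdf(samples)
# Fit a line to the ccdf on log-log axes; the slope is about -(alpha - 1).
slope, _ = np.polyfit(np.log(xs), np.log(ccdf), 1)
alpha_hat = 1.0 - slope
```

With matplotlib available, `plt.loglog(xs, ccdf)` would show the straight-line upper tail the snippet describes.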
- [PDF] Power laws, Pareto distributions and Zipf's law
The line α = 2 thus separates the regime in which you will with some frequency meet people with uncommon names from the regime in which you will rarely meet such people. E. Scale-free distributions: A power-law distribution is also sometimes called a scale-free distribution. Why? Because a power law is the only distribution that is the same whatever scale we look at it on. By this we mean the following. [...] II. The mathematics of power laws: A continuous real variable with a power-law distribution has a probability p(x) dx of taking a value in the interval from x to x + dx, where p(x) = Cx⁻ᵅ (7), with α > 0. As we saw in Section I.A, there must be some lowest value x_min at which the power law is obeyed, and we consider only the statistics of x above this value. [...] (33) Setting x = 1 we find that the constant is simply ln p(1), and then taking exponentials of both sides, p(x) = p(1)x⁻ᵅ (34), where α = −p′(1)/p(1). Thus, as advertised, the power-law distribution is the only function satisfying the scale-free criterion (29). This fact is more than just a curiosity.
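The scale-free criterion quoted above is easy to verify numerically: rescaling x by any factor b multiplies p(x) by a constant, b⁻ᵅ, that does not depend on x. A minimal sketch (constants are illustrative):

```python
import math

def p(x, C=1.5, alpha=2.5):
    """Power-law density p(x) = C * x**-alpha (constants are illustrative)."""
    return C * x ** -alpha

b = 3.0
# p(b*x) / p(x) equals b**-alpha for every x: the shape is unchanged
# by a change of scale, hence the name "scale-free distribution".
ratios = [p(b * x) / p(x) for x in (1.0, 7.0, 42.0)]
assert all(math.isclose(r, b ** -2.5) for r in ratios)
```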
- Power Law: A Pattern Behind Extreme Events | DataCamp
What is a power law distribution in simple terms? A power law describes a pattern where small events are common and large ones are rare—but still possible. Think of wealth, earthquakes, or website traffic. [...] The zeta distribution is a discrete version of the power law that is often used to model things that happen a lot, like website hits or book sales. [...] Power laws help us understand extreme events in nature. For example, the Gutenberg-Richter law states that earthquake magnitudes follow a power law distribution. This means that small tremors happen a lot, but as the magnitude of the quake gets bigger, the number of quakes drops off quickly. That's why a 2.0 quake might happen every day, but a 7.0 quake might not happen for years. Both are part of the same statistical story.
- Power Laws: How Nonlinear Relationships Amplify Results
> … We know that, per the power law secret, companies are not evenly distributed. The distribution tends to be bimodal; there are some great ones, and then there are a lot of ones that don’t really work at all. But understanding this isn’t enough. There is a big difference between understanding the power law secret in theory and being able to apply it in practice. [...] Thiel then explains why founders should focus on one key revenue stream, rather than trying to build multiple equal ones: > Even within an individual business, there is probably a sort of power law as to what’s going to drive it. It’s troubling if a startup insists that it’s going to make money in many different ways. The power law distribution on revenues says that one source of revenue will dominate everything else. [...] > One shouldn’t be mechanical about this heuristic, or treat it as some immutable investment strategy. But it actually checks out pretty well, so at the very least it compels you to think about power law distribution. > Understanding exponents and power law distributions isn’t just about understanding VC. There are important personal applications too. Many things, such as key life decisions or starting businesses, also result in similar distributions.
- Power Law and Power Law Distribution - Statistics How To
The power law (also called the scaling law) states that a relative change in one quantity results in a proportional relative change in another.