
On Value, Velocity and Monetary Theory

A New Approach to Cryptoasset Valuations

Valuation methodologies have historically lagged behind the development of the assets they represent. While the Dutch East India Company became the first entity to sell stocks on a public exchange in the early 1600s, it was not until the 20th century that a comprehensive framework for deriving the fundamental value of equity securities was developed. What Graham and Dodd benefited from in 1934 that their predecessors perhaps lacked was a broadly-accepted philosophy of disclosure (eventually codified in the Securities Act of 1933) and, more importantly, a reliable accounting system with unified measurement standards and practices — a common language for discussing value. Without rules of disclosure and requisite accounting conventions, current attempts at studying cryptoasset fundamentals will descend into the Confusion of Confusions that described seventeenth-century stock market investment advice.

In this piece, I propose an extension to the prevailing methodology for valuing cryptoassets — one that I hope will alleviate confusion by clarifying the vocabulary used in discussions of value. In the first part of the post, I survey current debates on cryptoasset fundamentals and investigate their core monetary assumptions. I find that current valuation models insufficiently capture the complexities of these conversations, motivating a new approach, which I outline in the second part of this post. The proposed method aims to separate demand for commodities from demand for money by placing each asset in a broader economy of return expectations and friction constraints. It is important to note, before continuing, that valuation theorists generally caution against valuation of non-cash-flow-generating assets. As such, the methodologies outlined below remain largely exploratory and imprecise. Nonetheless, I believe these discussions to be valuable in developing directional insights on cryptoasset value, which can be a key lever for projects in optimizing their incentive structures (I write in more detail about this process of ‘mechanism design’ here).

A Review of Prevailing Cryptoasset Valuation Frameworks

The prevailing approach to cryptoasset valuations is undoubtedly that most clearly articulated by Chris Burniske, which can be found in his book, as well as this bandwidth token model (for brevity, I refer to this as the “INET model” from here on out). Brett Winton’s social media token model is largely similar in its approach. The type of asset being modeled in the above takes on different names depending on who you ask — appcoin, medium-of-exchange token, cryptocommodity, proprietary-payment token, utility token. Though each of these terms captures some subtlety, you may consider them interchangeable for our purposes below. The overarching idea is that such an asset serves as the exclusive form of payment that the network will accept in exchange for an underlying scarce resource that it provides (bandwidth, storage, computation, and so forth).

The core thesis of current valuation frameworks is that utility value can be derived by (a) forecasting demand for the underlying resource that a network provisions (the network’s ‘GDP’) and (b) dividing this figure by the monetary base available for its fulfillment to obtain per-unit utility value. Present values can be derived from future expected utility values using conventional discounting. The theoretical framework that nearly all these valuation models employ is the equation of exchange, MV=PQ. While you will see this formula appear in nearly all discussions of cryptoasset value, it is often given little explanation and even less theoretical treatment. I attempt to offer some clarity on this below, but for now let’s review the basic terms of the equation. M refers to tokens in circulation and V refers to their ‘velocity’, or the number of times each token changes hands in a given period (though we will expand this definition later on). On the right side, P refers to price and Q to quantity (or transaction value in real terms if we are using the so-called Fisher formulation, MV=PT). The right-hand side is the “real economy,” or ‘GDP,’ while the left is the “monetary economy.” The two operate in parallel to each other.
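The two-step derivation described above can be sketched in a few lines. All figures below (PQ, token supply, velocity, discount rate) are illustrative assumptions of my own, not inputs from the INET model:

```python
# Hedged sketch of the equation-of-exchange valuation step: MV = PQ
# implies a per-token utility value of PQ / (M * V).

def utility_value_per_token(pq, m, v):
    """Per-token utility value implied by the equation of exchange."""
    # MV = PQ  =>  value per token = PQ / (M * V)
    return pq / (m * v)

def present_value(future_value, discount_rate, years):
    """Discount a future per-token value back to today."""
    return future_value / (1 + discount_rate) ** years

# Assumed inputs: $500M of annual resource spend on the network (PQ),
# 100M tokens in circulation (M), and a velocity of 20 (V).
future_utility = utility_value_per_token(pq=500e6, m=100e6, v=20)
today = present_value(future_utility, discount_rate=0.30, years=5)
print(round(future_utility, 4))  # 0.25
```

Note that the discounting step treats the per-token utility value like any other future cash amount; the choice of discount rate is left entirely to the modeler.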

Returning to the models linked to above, both hard-code a constant velocity, project a money supply schedule and a price/quantity schedule, and use the aforementioned equality to derive value per coin. Of course, the output of the model is extremely sensitive to variations in the hard-coded values for both the discount rate and velocity, neither of which is explored in detail. This is not problematic per se, as these models are intended to provide nothing more than directional insight. The problem has arisen as the discussion on value in the investment community has homed in on the “velocity” component, in what I will call the “velocity thesis.” The consequent lack of theoretical clarity in these discussions is a key motivator of this piece.
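The sensitivity claim is easy to demonstrate with a toy grid. Holding an assumed PQ and token supply fixed, a few plausible choices of velocity and discount rate span more than an order of magnitude in implied value (all numbers are my own illustrative assumptions, not model inputs):

```python
# Illustrative sensitivity of per-token value to the two hard-coded
# parameters: velocity (V) and the discount rate (r).
pq, m, years = 500e6, 100e6, 5  # assumed PQ, token supply, horizon

for v in (5, 20, 80):
    for r in (0.2, 0.4):
        value_today = (pq / (m * v)) / (1 + r) ** years
        print(f"V={v:>2}, r={r:.0%}: ${value_today:.3f}")
```

Running this shows the low-velocity/low-discount corner worth dozens of times the high-velocity/high-discount corner, which is why hard-coding both parameters deserves scrutiny.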

The Velocity Thesis

To understand the current debates on velocity and value, I suggest reading the following if you have not already: Crypto Valuation and 95 Crypto Theses for 2018 by Ryan Selkis; The Blockchain Token Velocity Problem by Kyle Samani; An (Institutional) Investor’s Take on Cryptoassets by John Pfeffer; On Medium-of-Exchange Token Valuations by Vitalik Buterin. I especially recommend the latter two as they form the inspiration for the approach detailed hereinafter.

(Note: I refer to these and other pieces under the “velocity thesis” for short-hand convenience. This is not to suggest that the people listed above do not have significant points of disagreement with each other with respect to the concept of value (they do). To the extent possible, I attempt to highlight these differences.)

The uniting argument in the above articles is that tokens that are not store-of-value assets will generally suffer from high velocity at scale as users avoid holding the asset for meaningful periods of time, suppressing ultimate value. My claim here is that this thesis is directionally correct, but hard to operationalize. Much of the difficulty, in my view, derives from an inability to precisely define, measure, and project velocity over time. The first step to ameliorating this is to better understand the theoretical framework it derives from. Correspondingly, I find it instructive to take a step back and explain velocity more clearly before more formally restating the thesis.

Most of the above articles define velocity as “the number of times money changes hands”, or formulaically as V=PQ/M, by reference to the equation of exchange. This, of course, leaves us none the wiser as to how to model velocity, as the equation of exchange is nothing more than an identity. MV=PQ just says that the money flow of expenditures is equal to the market value of what those expenditures buy, which is true by definition. The left and right sides are two ways of saying the same thing; it’s a form of double-entry accounting where each transaction is simultaneously recorded on both sides of the equation. Whether an effect should be recorded in M, V, P, or Q is, ultimately, arbitrary. To transform the identity into a tool with predictive potency, we need to make a series of assumptions about each of the variables. For example, monetarists assume M is determined exogenously, V is constant, and Q is independent of M and use the equation to demonstrate how increases in the money supply increase P (i.e. cause inflation). Initial M, Q, and P levels are usually separately estimated from different data sources, often leading to significant discrepancies. Meanwhile V is extremely hard to observe directly. As such, by convention, “[velocities] have generally been calculated as the numbers having the property that they render the equation correct. These calculated numbers therefore embody the whole of the counterpart to the statistical discrepancy.” (See Friedman).
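The Friedman point above — that velocity is conventionally backed out as whatever number makes the identity hold — can be made concrete with a minimal sketch. The dollar figures are made up for illustration:

```python
# Velocity as a residual: the number that "renders the equation correct."

def implied_velocity(pq, m):
    # V = PQ / M, rearranged from the identity MV = PQ
    return pq / m

# Assume we separately estimate $2B of annual transaction value (PQ)
# against an average monetary base of $400M (M).
v = implied_velocity(pq=2e9, m=400e6)
print(v)  # 5.0 -- each unit of the base turned over 5 times this year
```

Because V is computed from the other terms rather than observed, any measurement error in PQ or M flows directly into it — which is exactly the "statistical discrepancy" problem quoted above.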

Based on this understanding, we can now make three observations about the velocity thesis:

1.) The first practical problem with velocity is that it’s frequently employed as a catch-all to make the two sides of the equation of exchange balance. It often simply captures the error in our estimation of the other variables in the model. Without further specification, saying that projects should minimize velocity is about as useful as saying that a business should attempt to maximize goodwill on its balance sheet. As velocity in its pure form is difficult to directly observe (the best attempt I am aware of is outlined in section 3.4 of the BlockSci paper), current (tautological) definitions of velocity as PQ/M need to be reformulated to allow us to estimate velocity separately from the other variables and project its fluctuations over time. Returning to our historical analogy, we need to develop better accounting definitions before we can truly operationalize velocity and related concepts (e.g. PE, RoA, PB would all be meaningless if we could not agree on how to define and measure them across businesses and over time).

2.) If token value is equal to PQ/VM, we can agree that the problem is not “high” vs “low” velocity, as some writing on the topic seems to suggest (though that certainly matters, all things being equal). The real question is how changes in velocity correlate with changes in PQ. Strong positive correlations approaching 1 effectively decouple token value from network transaction growth (note that while this is a drag on the upside, it is protective of value on the downside). If the two are uncorrelated, then token utility value grows (and declines) linearly with demand for the underlying utility (this is what happens in the INET model). Negative correlations act as a lever, generating outsized price swings relative to growth or decline in PQ. Therefore, we need to compare the growth rates of velocity and PQ to formalize the velocity thesis: if velocity grows faster than PQ, token utility value declines. Vitalik and Pfeffer argue that this will likely happen given sufficiently low transaction friction and superior return/store-of-value alternatives. The validity (or invalidity) of this prediction is not something we can explore thoroughly in current models.

3.) Before continuing, I would note, as a practical matter, that I caution against the suggestions of some proponents of the velocity thesis that projects create artificial velocity-suppression mechanisms. I view staking, mint-and-burn, and other “sinks” as useful only to the extent that they genuinely improve governance and user experience, as opposed to acting as artificial price props. Furthermore, I believe velocity can only be practically useful if it is precisely defined and measurable. Let the finance folk argue about it for now; until you can define and measure it, I would say you can safely ignore it and go back to building.
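Observation (2) can be illustrated with a toy growth comparison: holding M fixed, per-token value PQ/(V·M) accretes only if PQ outpaces velocity. The growth rates below are assumptions chosen purely for illustration:

```python
# Toy illustration of observation (2): with M fixed, value per token is
# PQ / (V * M), so value grows only if PQ grows faster than velocity.

def value_path(pq0, v0, m, pq_growth, v_growth, years):
    """Per-token value each year under constant PQ and V growth rates."""
    values = []
    pq, v = pq0, v0
    for _ in range(years + 1):
        values.append(pq / (v * m))
        pq *= 1 + pq_growth
        v *= 1 + v_growth
    return values

m = 100e6  # assumed fixed token supply
# Case A: PQ grows 50%/yr, velocity only 10%/yr -> value accretes.
a = value_path(500e6, 20, m, 0.50, 0.10, 5)
# Case B: velocity grows exactly as fast as PQ -> value stays flat,
# i.e. token value is decoupled from network transaction growth.
b = value_path(500e6, 20, m, 0.50, 0.50, 5)
print(a[-1] > a[0])   # True
print(abs(b[-1] - b[0]) < 1e-9)  # True
```

Case B is the "decoupling" scenario described above; a case where velocity grows faster than PQ would show the path declining, which is the velocity thesis in miniature.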

In sum, we can now concretely state that to support (or refute) the velocity thesis, we need to examine how velocity, PQ, and utility value relate to each other over time, requiring a way to operationalize velocity and project it over time. This, in turn, necessitates a slight departure from existing cryptoasset valuation methodology.

A Monetary Critique of the INET Model

Now that we’ve examined the velocity thesis in some detail, we can articulate six challenges that it presents to the INET model from the standpoint of monetary theory:

1.) Demand for money and demand for commodities are conceptually distinct and should be accordingly decoupled in monetary valuation models. In the INET model, demand for INET and demand for bandwidth are practically one and the same. We need to model demand for money separately from demand for commodities.

2.) Cryptoassets are embedded in a broader asset universe. One way of looking at the INET model is as a single-good, single-asset economy. Utility tokens should instead be seen to compete with other assets for holders. As a corollary, expected returns of other assets should affect demand for the asset being modeled.

3.) Related to (2), transactional demand and speculative demand are conceptually distinct and have different drivers. In the INET model, speculative demand is captured in a supply reduction (HODLed coins are removed from circulation). If the asset is not a store of value, as per the velocity thesis, it may be more realistic to direct speculative demand to another asset and capture this store-of-value demand within the money demand curve.

4.) Blockchain economies are (currently) highly frictional. These frictions could be in the form of transaction costs/fees, cognitive costs, illiquidity, inconvenience, and others. We need to model these frictions, project their behavior over time, and trace their effects on demand for different assets.

5.) Velocity (a) varies over time and (b) should be measurable in a way that is (at least somewhat) distinct from the other variables. That is, in the INET model, velocity and PQ have no correlation, while price and PQ have a perfect 1.0 correlation coefficient. We instead need to model endogenous velocity as time-varying and distinct from the money supply term to better study the effects of PQ movements on value accretion.

6.) Expanding on (5), payment patterns matter. In INET, patterns of payments don’t matter as long as they all sum to the same transactional volume. If users purchase one large token balance each year and spend it gradually over the year, the effects on value are different than if users made many small purchases to finance consumption as needed. We need a way of modeling different payment patterns and their effects on velocity and value.
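Point (6) can be sketched with a hypothetical per-user simulation: the same annual spend produces very different average balances, and therefore velocities, depending on the purchase pattern. All figures are illustrative assumptions:

```python
# Toy model of point (6): identical annual spend (PQ), different payment
# patterns, different velocities.

def velocity(annual_spend, avg_balance):
    # V = PQ / M, with M proxied by the user's average token balance
    return annual_spend / avg_balance

annual_spend = 1200.0  # dollars of network utility bought per user per year

# Pattern 1: buy a full year's tokens up front, spend them down linearly.
# Balance falls from 1200 to 0, so the average balance held is 600.
v_lump = velocity(annual_spend, avg_balance=600.0)

# Pattern 2: buy just-in-time each month, holding on average half of one
# month's spend: (1200 / 12) / 2 = 50.
v_jit = velocity(annual_spend, avg_balance=50.0)

print(v_lump)  # 2.0
print(v_jit)   # 24.0
```

Identical transactional volume, a twelvefold difference in velocity — which is why a model that only tracks total PQ cannot distinguish these two worlds.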

More broadly, I believe the community’s understanding of cryptoasset value drivers has outgrown the simple articulation of value in the INET model. And Chris seems to agree.

Complete Article: https://medium.com
