Have we considered using the Uniswap model for domains, based on length tranches?

It’s hard to know the “right” price for a domain, and finding a fair pricing algorithm is a recurring topic here, but usually the hardest part is figuring out the metric we want to optimize. Here’s one I’ve been considering while looking for domains:

"What’s the chance that the name you’re looking for is already taken?"

We can assume that the ability to find available domains is a worthy goal. Of course, the problem is that the theoretical limit is infinite. But that’s not so for shorter domains.

Uniswap uses a rather straightforward formula: x * y = k.

Where k is a constant and x and y are the variables. The interesting thing about the formula is that it describes a curve with asymptotes: as one variable shrinks, the price tends toward infinity, and as it grows, toward zero, but it never reaches either. How would this work for ENS?

price of a domain * number of available domains of that length = constant K

This means that the very last domain will be infinitely expensive and the very first almost free, according to a curve we set. For instance, there are 17,576 three-letter domains (in the English alphabet). Suppose we set the K of this tranche at around 35,000 (2 * 26 * 26 * 26 = 35,152). It would mean that:

If only 1,000 three-letter names are registered (under 6%), each registration costs about 2.1 ether.
If 7,000 names are taken (about 40% occupancy), the price rises to about 3.3 ether.
The 10,000th name will cost about 4.6 ether.
If 15,000 names are taken (over 85%), it would cost about 13.6 ether.
If 17,000 names were registered (almost 97%), each new name costs over 60 ether.
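The figures above can be reproduced with a quick sketch, assuming the formula is price = K / (names remaining) and pinning K to exactly 2 * 26³ = 35,152, so the results come out slightly different from rounded estimates:

```python
# Sketch of the proposed tranche pricing for three-letter names:
# price(n) = K / (names remaining). The constant and per-tranche setup
# follow the proposal above; the exact K value is an assumption.

TOTAL = 26 ** 3          # 17,576 possible three-letter names
K = 2 * 26 ** 3          # tranche constant (35,152, i.e. "around 35,000")

def price(registered: int) -> float:
    """Ether price of the next registration, given how many names are taken."""
    remaining = TOTAL - registered
    if remaining <= 0:
        raise ValueError("tranche exhausted: the price is unbounded")
    return K / remaining

for n in (1_000, 7_000, 10_000, 15_000, 17_000):
    print(f"{n:>6} registered -> {price(n):5.1f} ether")
```

As the tranche fills, the denominator shrinks toward zero and the price grows without bound, which is exactly the squatting deterrent the curve is meant to provide.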

This could be applied to annual fees going forward; that way, it encourages there always being a decent number of names available for registration. Maybe by releasing a name you get back a portion of that amount, meaning that if prices are going up, you have a good reason to drop your unused names. There would be tranches for 3-6 letter names, and then we could use the 6-letter price for everything longer, since there are over 300 million of them. This would also mean that for long domains (or non-English ones) prices would always stay small.

Would this make sense?


I quite like this idea! It seems like it should lead to an equilibrium where names of a given length are priced reasonably for someone who actually wants to use the name, and makes squatting self-regulating.

It seems like it would work substantially better for shorter names than for longer ones, though; we can never expect to fill up an appreciable portion of the namespace for longer names, and so it’d effectively be a flat rate.

It would also likely need separate parameters for each name length; the constant K that works for 3 letter names is likely too low for sensible regulation of 4 letter names, and so on.

One major wrinkle is that I’m not sure how we could actually implement this; we don’t currently count the number of domains of each length onchain, and since expiry is just a clock running out (no transaction to ‘expire’ a name) decrementing the count would be tough even if we had a counter.

Perhaps there are approximations we can use? For example, we could set price of a domain * registration-seconds purchased = constant K, where registration-seconds purchased is an exponential moving average of recent registrations of that length?
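A hypothetical sketch of how that could look. All names and constants here are made up, and I treat the EMA as a stand-in for the occupied side of the pool (so prices rise with recent registrations and decay afterwards, matching the behavior described below), rather than applying the formula literally:

```python
# Hypothetical sketch of replacing an exact on-chain counter with an
# exponential moving average (EMA) of registration-seconds recently sold.
# CAPACITY, K, and HALF_LIFE are illustrative assumptions, not ENS parameters.

YEAR = 365 * 24 * 3600
HALF_LIFE = 30 * 24 * 3600            # EMA half-life: 30 days (assumption)
CAPACITY = 17_576 * YEAR              # proxy "reserve" for the 3-letter tranche
K = 2 * CAPACITY                      # tranche constant, by analogy with 2 * 26^3

class EmaPricer:
    def __init__(self) -> None:
        self.ema = 0.0                # smoothed registration-seconds outstanding
        self.last = 0                 # timestamp of the previous update

    def _decay(self, now: int) -> None:
        # With no new registrations the average halves every HALF_LIFE,
        # standing in for names quietly expiring without any transaction.
        self.ema *= 0.5 ** ((now - self.last) / HALF_LIFE)
        self.last = now

    def quote(self, now: int) -> float:
        # price * (CAPACITY - EMA) = K: the emptier the proxy reserve
        # looks, the higher the price, mirroring the tranche formula.
        self._decay(now)
        return K / max(CAPACITY - self.ema, 1.0)

    def register(self, now: int, seconds: int) -> float:
        # Quote a price, then fold the purchase into the average,
        # so every registration pushes the next price up a little.
        price = self.quote(now)
        self.ema += seconds
        return price
```

With nothing registered the quote starts at K / CAPACITY = 2 ether; a burst of registrations pushes it up, and a quiet month halves the pressure again, so no per-name expiry transaction is ever needed.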

Using a registration-seconds moving average is an interesting idea, because you’re then using time as a counter in both directions. Prices go up every time someone purchases a domain, but then slowly come back down again…

Yup. It’s definitely not as good as just counting registered domains, but under the assumption that people renew them, it’ll be a fair approximation of it.