(pRFP) an IPFS .eth websites pinning software


This pRFP calls for building a pinning software for IPFS .eth websites.

A subsequent RFP for operating a community service based on this software will follow upon the completion of this one.

The motivation is that IPFS .eth websites need to be pinned in order to be accessible. It seems that most IPFS .eth websites use commercial centralized pinning services. This dependence on centralized services prevents the .eth web from being independent, decentralized, and autonomous.

With the software described in this pRFP in place, anyone from the community could easily participate in the pinning of the .eth web. It will also enable the DAO to support community services for pinning the .eth web.

This will drive the adoption of .eth websites by making them more stable and reliable for visitors, and by making it easier for builders to maintain .eth websites for the long term.

See also: temp check by @gregskril for this idea.


Most .eth websites have their content served by IPFS. For the content of these .eth websites to be available, at least one node needs to store the content and be always online to share it on the IPFS network. In IPFS terms, we say that “the node pins the content”.

For a .eth website to be accessible, at least one node needs to pin its content.

The problem

Currently, most people and projects use commercial pinning services, such as Pinata or web3.storage, to pin their .eth websites. This is observed in the .eth SG calls, in the discussions in the Esteroids .eth websites Discord, and in the list of educational resources for building .eth websites. While it is possible to pin your own website, it is quite a challenging, and expensive, thing to do with the current state of the go-ipfs software.

Greg, the simple.eth and Nimi teams, and half a dozen other people have raised this as a problem in a couple of .eth websites SG calls in May.

This dependency on commercial services is risky not only because it endangers the decentralized nature of the .eth web, but also because it endangers websites’ “longevity”: websites may disappear in the long term once the commercial entities pinning them no longer exist. This damages the user experience and the reputation of .eth websites among visitors, since older websites are often no longer available.

The solution

To solve this problem we propose a pRFP for an IPFS .eth websites pinning software.

This software will react to the creation or modification of a .eth website on the blockchain and pin the new website automatically. The software should be easy to install and configure on a server, enabling anyone who wants to participate in the pinning of the .eth web.

To offer a complete solution, this RFP will be followed later by a subsequent one, calling for the operation of public goods services based on the .eth websites pinning software.

The outcome

As a result of this and the subsequent RFP, the .eth web will be more decentralized, stable, and robust. These results will drive the adoption of .eth websites for several reasons:

  • It provides a better user experience since websites will be more accessible.
  • For the same reason, it will improve the reputation of the .eth web, making it known as the most decentralized, autonomous, and stable web in existence.
  • It makes it easier to build websites since builders will not have to worry about long-term pinning.

The advantages enabled by a .eth website pinning software are in the interest of ENS DAO as they supply a stable foundation for the .eth web, and demonstrate how the DAO supports it.

Scope of Work

A proposal should include the following elements.

  1. An IPFS .eth websites pinning software. The IPFS .eth website pinning software will work as follows. It will monitor ContenthashChanged events of ENS Resolvers on the Ethereum blockchain. Once a new IPFS contenthash is set, the software will fetch the content corresponding to this contenthash and will pin it.

  2. A Docker container for this software. This lowers the barrier for people to participate in the pinning.

  3. Installation and operation instructions for the service.

  4. A service report generator. The software will have a feature to generate reports about the functionality of the server, with data such as how many CIDs are pinned, uptime, and so on.

  5. A pin checker .eth website. On this website, users can enter a .eth name, and the website will show which IPFS CID, if any, is set for it, and which of the pinning services the website knows of pin it.

  6. An uptime monitor. The monitor should be in the spirit of the IPFS gateway monitor.
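To make item 1 concrete, the core step a pinning daemon would perform after seeing a ContenthashChanged event can be sketched in Python. This is only an illustrative sketch, not part of any existing codebase: it assumes the contenthash follows EIP-1577 (the varint ipfs-ns code 0xe3 followed by a CIDv1 with the dag-pb codec and a sha2-256 multihash), and the function names are hypothetical.

```python
# Illustrative sketch: decode an EIP-1577 IPFS contenthash (as emitted
# in a ContenthashChanged event) into a CIDv0 string that could then be
# handed to an IPFS node for pinning. Pure stdlib; names are hypothetical.

BASE58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc_encode(data: bytes) -> str:
    """Base58btc encoding, as used for CIDv0 ("Qm...") strings."""
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, rem = divmod(n, 58)
        out = BASE58_ALPHABET[rem] + out
    # Leading zero bytes are encoded as leading '1' characters.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def decode_ipfs_contenthash(contenthash_hex: str) -> str:
    """Decode an EIP-1577 ipfs-ns contenthash into a CIDv0 string."""
    raw = bytes.fromhex(contenthash_hex.removeprefix("0x"))
    # EIP-1577 prefixes the CID with the varint protocol code for
    # ipfs-ns (0xe3), which encodes as the two bytes 0xe3 0x01.
    if raw[:2] != b"\xe3\x01":
        raise ValueError("not an ipfs-ns contenthash")
    cid = raw[2:]
    # Expect CIDv1 bytes: version 0x01, dag-pb codec 0x70, then the
    # multihash (0x12 0x20 = sha2-256 with a 32-byte digest).
    if cid[:2] != b"\x01\x70" or cid[2:4] != b"\x12\x20":
        raise ValueError("unsupported CID layout")
    # CIDv0 is simply the base58btc encoding of the bare multihash.
    return base58btc_encode(cid[2:])
```

Feeding a typical onchain ipfs-ns contenthash through `decode_ipfs_contenthash` yields a 46-character “Qm…” CID that the daemon would then fetch and pin.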

Selection Criteria for a Vendor

The Ecosystem Working Group will field proposals from members of the public and ENS DAO members.

The Ecosystem Working Group will review the submissions, and the applicant will be selected based on the deliverables outlined below.

The final selection and the award of this RFP will be made by the ENS Ecosystem Working Group.

Award Strategy

The Ecosystem Working Group will pay the selected applicant. Each proposal should include a clear list of deliverables for the Ecosystem Working Group to check upon the completion of the work.

An advance of 30% will be paid up-front, with the rest paid upon completion of the work.

Estimated Budget

50,000 - 100,000 USDC

Does the Ecosystem Working Group see value in this proposal? @slobo.eth .


Some discussion points I didn’t want to mention in the pRFP itself. There were two extra features discussed in the SG calls:

  • Pinning for IPNS+ens websites (and not only IPFS+ENS websites).
  • Pinning all the historical versions of each website (the software in this proposal pins only the latest version).

We eventually left those two features out to keep things simple. Once the software exists, it’s easier to add those features later on. But if we included them in the pRFP from the beginning, it would make the development process longer and more expensive.

Btw, @nick.eth, can we have a category for the dot-eth-web?


I have been saying for more than a year that @neiman and the Esteroids team are building tools and products on top of the ENS protocol that everyone needs but they don’t even know it yet.

It costs :fuelpump: to set the content record (the ENS website’s IPFS hash); then it is a manual process to update the record, and additional :fuelpump: for every change you make to your ENS website. This automates the ENS website updates and removes the cost of :fuelpump:.

If anything, his request for funding his team is far too low.



It seems like the solution here is overly complex. Creating entirely custom software like this is unnecessary unless we know other options aren’t available, and in this case I think there is a much simpler alternative.

Creating an IPFS cluster would allow a group of IPFS nodes to pin/host synchronised content. Alongside this, I think it would make more sense to allow certain service providers to contribute, as opposed to making it entirely decentralised, which has a significant number of quirks.

If it were determined that developing this software were actually necessary, I think there would still be quite a few flaws with this current proposal:

  1. Listening for ContenthashChanged events requires that a file is already pinned before it can be pinned into the proposed network. I understand your solution currently is that the user creates a CID themselves, but this is not at all a good user experience and therefore isn’t really a great solution to the problem.
  2. Listening for specific events limits the scope of the proposal significantly, without benefit. There doesn’t seem to be any explanation for why event listening is needed, and it means that avatars and banners couldn’t be hosted on the service.
  3. The idea revolves around people hosting IPFS nodes without incentive, which isn’t a guarantee. Uptime of at least 1 node is important to make sure that existing pins are accessible, and that new pins can be made. Is there a plan for making sure that happens?
  4. I think the technical details should be fleshed out further; it seems like some aspects are unknowns (authentication, if/how nodes sync, etc).

Good points!

I’d like to point out that there’s a second RFP planned if this one is approved:

The plan for the second RFP is for the DAO to fund 2-3 community services for pinning .eth websites (based on this software). The operation should not be expensive with such software in place.

The problem that was pointed out by the community in the .eth websites SG calls is that pinning is both centralized and incomplete. I don’t understand how using IPFS Cluster would solve either of those issues?

Btw, did you ever try to run an IPFS pinning service? With the software that currently exists, it’s quite challenging work.

We also discussed this in the .eth websites SG. The conclusion we came up with was written in the temp check that @gregskril made:

In our conversations, we realized that all of these services use Pinata’s API to upload/pin files to IPFS. This works perfectly as an on-ramp, but feels like a loss for decentralization.

Basically, we didn’t include uploading data because it makes development more complex, and it’s not necessary for solving the problems described in the RFP. It is something that we should add at a later stage to the software though, definitely.

The proposal is for pinning .eth websites, and event listening is needed to determine when a website was created or changed. Maybe it is indeed a good idea to add avatars/banners; we didn’t include them just to stay focused on the topic of .eth websites.

As written in the proposal, a subsequent RFP is planned for operating services based on this software.

We thought to leave these details to those who will bid on this proposal.

Btw, you are warmly invited to the next call of the .eth websites SG! Everything you wrote is exactly the stuff we discuss there:-)


Can you expand this? I don’t fully understand the difference.

Is this purely for developing such software, and not about the running cost?

The only part unique to the ENS ecosystem is detecting all contenthash-set events, which can be done with a single query to a Graph subgraph. Please also bear in mind that a contenthash set under a wildcard with an offchain resolver doesn’t necessarily emit events, hence you cannot detect events to index.

I am not sure this is really the problem, as IPFS itself is open source, hence we have no lock-in with commercial service providers.

I understand that it is a good incentive for DAOs to fund pinning of all contenthashes set under ENS names, but it feels like reinventing the wheel to fund building such software. The RFP should rather fund the operation of pinning all .eth websites, so that existing entities like Pinata or fleek.co can also submit proposals.


First, I’d like to point out that this proposal is not mine personally, but came out of the .eth websites SG calls and many discussions following that in the last 1.5 months.

The temp check was done by someone else from the community.

Sure! In ENS+IPFS websites, the IPFS CID (contenthash) is set onchain for the .eth name. Any update to the website is done with an Ethereum transaction and emits an event.

With ENS+IPNS websites, what you set on chain is an “IPNS key”. This is basically a public key representing the owner of the website on the IPFS network. Then, to update the website, the owner of the IPNS key simply signs a new CID and broadcasts it on the IPFS network. The update happens offchain.

Updates of ENS+IPNS websites are harder to catch because they don’t create an onchain event. In esteroids.eth we simply rescan all ENS+IPNS websites periodically to check for updates (though there are possibly better solutions that we still need to look into).
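The periodic-rescan approach described above can be sketched roughly as follows. This is only an illustrative Python sketch: `resolve_ipns`, `pin`, and `unpin` are hypothetical injected callbacks standing in for real IPFS node calls, so the loop itself stays independent of any particular node software.

```python
# Illustrative sketch of a periodic IPNS rescan: for each ENS+IPNS
# website, re-resolve the IPNS key and re-pin only when the resolved
# CID changed since the last pass. The resolver and pinner are injected
# stand-ins (hypothetical names, not a real API).
from typing import Callable, Optional

def rescan_ipns_sites(
    ipns_keys: list,
    resolve_ipns: Callable[[str], Optional[str]],  # IPNS key -> current CID
    pinned: dict,                                  # IPNS key -> last pinned CID
    pin: Callable[[str], None],
    unpin: Callable[[str], None],
) -> list:
    """Run one rescan pass; return the keys whose content changed."""
    changed = []
    for key in ipns_keys:
        cid = resolve_ipns(key)
        old = pinned.get(key)
        if cid is None or cid == old:
            continue  # unresolvable or unchanged: nothing to do
        if old is not None:
            unpin(old)  # drop the stale version
        pin(cid)
        pinned[key] = cid
        changed.append(key)
    return changed
```

A real daemon would call this on a timer; keeping the IPFS calls injected makes the update logic easy to test and to swap for a smarter change-detection mechanism later.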

I think that a .eth web pinning software should include also ENS+IPNS websites, but it can be added later on, and to keep it simple we didn’t put it in this pRFP.

Only for the software.

True! There is even someone in our Discord who did that and pinned everything he found using web3.storage.

… but this only gives a snapshot of the .eth web at one specific point in time, whereas a .eth web pinning service should always be up to date - so following the updates (using the events) seems easier.

I know, but so far, to the best of my knowledge, no one in the community is using that.

The idea is to keep the software updated with what people use in the future, so entities who pin using it won’t have to worry about tracking the different methods people use over time.

go-ipfs on its own is not enough to run a pinning service. It’s simply not stable enough, and I don’t think it was meant for it.

The incentive for this RFP is for having software that makes it easy to pin the .eth web. Funding the pinning itself is a matter of the subsequent RFP.

Regarding reinventing the wheel, I’m not familiar with any software that makes it possible to pin the .eth web, but I’ll be happy to stand corrected. Our experience shows that running any service based only on go-ipfs, without some wrapper, simply doesn’t work, since, as I wrote above, I’m not sure the current version of go-ipfs is meant for it.

This is indeed the goal of the subsequent RFP.

We could start with that RFP instead of the current one, but then you actually limit the number of entities who can submit proposals, and the number of people who can participate in the pinning without a proposal, since it requires more work to create such a service.

Edit. I thought to point out that we have experience with pinning services since we run an IPNS pinning service for the .eth web community: https://dwebservices.xyz/.


ah I see. I was rather confused by IPNS and ENS; I didn’t realise it was talking about the difference between IPFS and IPNS.

This is more of a problem of the IPFS ecosystem, and I don’t think we should fund solving it, IMHO.

I agree that there isn’t any service specifically targeted at the eth community, but this is more about making sure a given list of IPFS CIDs is kept updated. Getting the list itself is a one-liner script, unless I am missing something critical.

Can we rather have an RFP asking for an operational service that also includes the one-off development cost?
It seems wasteful to spend a one-off $50~100k building custom software, only to find out that other commercial services can provide the same service at a lower running cost.

The primary objective (if I understand correctly) is to provide a free pinning service to all .eth website users (which I am supportive of), not necessarily to build such software from scratch for decentralisation purposes (as I repeat, the software component won’t have lock-in, as the IPFS protocol itself is open source).

FYI, https://github.com/ipfs/pinning-services-api-spec#online-services has a list of commercial pinning services, and we can ask people from Protocol Labs for their input if we need any technical guidance evaluating the RFP.


First, I’d like to clarify something. The RFP is not for rewriting go-ipfs. It’s rather for writing a pinning software for the .eth web which uses go-ipfs as one of its components.

The RFP also has other components besides this software that the SG felt the community needs, see items 2-6 in the Scope of Work.

Assuming that there is a pinning service running, and as long as we want to support only IPFS, you are almost right. The more complete answer would be a system that monitors websites, unpins old versions, and pins new ones.

I think your “one liner script” ignores the time factor and the fact that the .eth web changes all the time.

But more importantly, your proposal doesn’t fit future developments of the .eth web, like IPNS (which is already becoming popular), resolvers other than the standard one, CCIP L2 .eth websites, etc. One advantage of having dedicated software is that it can adjust over time.

Actually, the primary objective is not to be dependent on commercial pinning services:-) I think it was written both in this pRFP and in the temp check.

As the lead of the .eth websites SG, I’m in an awkward position in which I don’t voice my own opinion, but rather voice opinions that were made in the .eth websites SG calls, discussions, and drafts towards this RFP.

The whole discussion there began by people raising concerns about their dependence on commercial services. As such, I don’t feel I have the right to exchange this RFP with one that calls for commercial services to pin the .eth web.

(to be completely honest, if it had been only up to me, the RFP would have been different - though it would still aim to be independent of commercial services - but I made my best effort to build it upon the opinions of the .eth websites community)

The original idea of the group was to have one RFP for both writing the software and operating a service based on it, but per the WG stewards’ request, we divided it into two RFPs.

However, the software was not meant only for in-house usage of the future operator, but as is in this RFP, it was meant as something that anyone could use to participate in the pinning of the .eth web.


That’s more about the implementation logic of how to keep them up to date. My point that querying the list of IPFS/IPNS-based contenthashes is relatively simple still stands.

Can you clarify whether the focus is to prevent reliance on a “single provider” or on “commercial providers”?

If the idea is to always have more than a single pinning service provider (whether commercial or not) in operation, then don’t we also need enough of an incentivisation mechanism to make sure that it provides enough uptime as a whole? That feels like we need to design something like Filecoin, but that’s not included in the scope (and I don’t think we should create such things on our own).

I do understand the problem of relying on a single provider. For example, .eth.link is provided by Cloudflare, but it suffered from instability until .eth.limo came up. I think the simpler solution is to have an SLA with a single pinning service (or two if we want redundancy), and switch providers whenever they don’t provide enough uptime.

If the idea is to prevent any commercial providers from submitting to the RFPs, however, then any orgs/individuals who provide a service in return for financial compensation (including receiving grants from ENS DAO) are commercial providers by definition, so we would need to rely only on volunteers with no financial compensation - and I am not sure that’s realistic.

Can we include in the RFP criteria that proposals score highly if they promise to open source any tools built with the grant that are specific to servicing .eth websites?

I appreciate your effort of representing the ecosystem community’s voice.


It won’t work for IPNS, only IPFS.

Otherwise, I’m not sure which part of my reply you actually disagree with?

The focus is to prevent reliance on commercial providers (with an ‘s’, in plural :D)

I didn’t mean that. Anyone can submit a bid to the RFP.

The idea is that by making it easier to pin the .eth web with this software, we will not rely on commercial services. None of this implies that commercial services are not required or can’t submit bids on the RFP; it simply - hopefully - will create a .eth web that does not rely only on them.

Sure! As I wrote, this was originally what the SG proposed. We changed it at the request of the WG stewards. If the new stewards of this term want to merge the software creation and service provision into one RFP - then great.


It appears to me that @matoken.eth is not yet aware of IPNS and its relevance, for which no pinning providers exist so far. IPNS is used by all serious .eth websites, since IPFS is not meant for dynamic updates. This is not surprising, since most people either do not have a .eth webpage or host a static page on IPFS. The next step is IPNS, and everyone should get to know it first. Here is some educational material: IPNS | IPFS Docs

As a dev who integrated IPNS support into our ENS web app, I am fully aware of what IPNS does. fleek.co does in fact offer a pinning service via IPNS for sites which deploy to a .eth website via their service.

Again, pinning of IPNS content is also something many commercial IPFS/IPNS providers support, so I would assume that existing players have some solutions.


I am not disputing that you can’t detect changes of content via IPNS through events. I am only saying that it’s one query to get the list of .eth websites hosted via IPNS. How service providers keep these updated is up to the proposal from each provider.

I personally don’t agree with this sentiment but as long as the RFP is not blocking submission from commercial entities (which you mention that it won’t be the case in the subsequent comment), I won’t be arguing any more.


As you pointed out, Fleek offers an IPNS pinning service for content that is deployed by them. They also hold the IPNS private keys themselves, AFAIR.

I imagine @inplco was referring to an IPNS pinning service in the sense that you give it an IPNS public key, and the service makes sure that the content associated with it is pinned at all times (even if it changes all the time, even if it was not pinned by them).

If you know of anyone doing that, I’ll be happy to hear about it (really, it would be amazing to find out about such a service!)



Actually, you mentioned that this proposal excludes IPNS to keep things simple, so it’s probably not worth debating how to support it anyway?

Yeah, I am currently enquiring with the Protocol Labs team about whether such a service/tool exists, as all the problems described seem to me not unique to ENS IPFS/IPNS sites (meaning that someone wants a list of CIDs pinned without relying on a single commercial vendor, in a decentralised manner if possible).

Technical difficulty aside, one thing I still don’t fully understand is why commissioning a commercial vendor or vendors to pin all .eth websites is worse than giving grants to some third party to offer a similar service with open-sourced software. Isn’t it almost equal to saying that ENS DAO should give grants to third-party orgs to run their own geth nodes (or a dockerised, easy-to-launch image) because relying on Infura is bad? It may not be ideal, but it’s not too bad as long as we can easily switch to another RPC endpoint provider.

Having said all that, there is a possibility that no commercial vendor offers such a service; if that’s the case, then I think it’s worth funding the creation of such software.

I think the focus on ease of use for the software isn’t really necessary then. Of course it shouldn’t be hard to use, but if we are funding operators they will probably have whatever knowledge is necessary.

With trusted operators, having an IPFS cluster means that we aren’t relying on any given service provider and pins can be added with minimal extra configuration. I should also explain that we will be revamping the current CID flow in ENS App v3, meaning that we would easily be able to integrate simple communication between the app and the IPFS cluster to allow pinning files directly in the ENS app itself.

It’s important that a proposal like this not be based purely on the existing ENS app flow for CIDs, in which you just paste a hash. If we were constrained by that, I would definitely think this proposal had the better usability; however, that is not the case.

I assume the operators we choose would have experience regarding this, therefore I don’t think it’s that much of an issue.

If direct upload isn’t available, you create additional problems. Using an external node to pin a file means that you need to somehow know when the file becomes pinned by the software, so it can be unpinned by the external node. This means there needs to be some sort of communication between any given app or service and the software, which has a wide variety of technical complexities.
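The coordination being described could look roughly like this as a Python sketch. The status values mirror the pin states defined in the IPFS Pinning Service API spec (queued/pinning/pinned/failed), but `check_status` and `unpin_local` are hypothetical injected callbacks, not an existing API; this only illustrates the handoff logic, not a real client.

```python
# Illustrative sketch of the pin handoff: an app pins new content on
# its own temporary node, then polls the community pinning software
# until the CID is reported as pinned there, and only then releases
# the local pin. Callbacks are hypothetical stand-ins.
from typing import Callable

def hand_off_pin(
    cid: str,
    check_status: Callable[[str], str],  # "queued" | "pinning" | "pinned" | "failed"
    unpin_local: Callable[[str], None],
    max_polls: int = 10,
) -> bool:
    """Poll until the remote service pins `cid`, then drop the local copy."""
    for _ in range(max_polls):
        status = check_status(cid)
        if status == "pinned":
            unpin_local(cid)  # safe to drop our temporary copy now
            return True
        if status == "failed":
            return False  # keep the local pin; retry out of band
        # A real client would sleep between polls here.
    return False  # timed out: keep the local copy to stay safe
```

Even in this simplified form, the sketch shows why the handoff needs an agreed status protocol between the app and the pinning software: releasing the local copy too early would leave a window where no node pins the content.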

Looks like your previous calls haven’t been at a great time for me, but send me a DM whenever you plan the next one and I’ll see if I can make it.

A service the DAO is providing should probably support all the features of ENS, even if they have low adoption rates for the moment.

Overall I still think the proposal isn’t quite solving the problem at hand, while also being complex. If an IPFS cluster could be used, the software needed to make it fit within the ecosystem would be minimal.

Right, but if this brings more support we can include it, and hope for bids within the budget. No one in the SG opposed supporting IPNS.

Esteroids might make a bid in this case as well (though not sure).

I apologize, but I didn’t understand the question :slight_smile:

Edit: I’ll clarify. My understanding is that this pRFP needs to be turned into an RFP by the Ecosystem WG, since they approve the budget.

If the WG wants to include IPNS in the RFP – fine.
If the WG wants to merge items 1-6 in this pRFP with a call for operating a service based on this software – fine.
If a commercial entity submits bids for the RFP – really fine!:slight_smile:

What I oppose is:

  • directly contacting an existing commercial entity and offering to pay them to pin the .eth web.
  • having an RFP for a commercial entity that pins the .eth web using software they wrote specifically for their own setting, but which can’t be used by others for the same goal (even if it’s open-source).

If after the long discussion with @matoken.eth you still think it’s possible to do with IPFS cluster, then I’m afraid I won’t be able to convince you further. I already said everything I could in this thread.

I didn’t understand the comments regarding ENS App. Did this pRFP mix somehow with TNL future plans for the ENS App?

The specific implementation your proposal sets out seems to be based around entering a CID somewhere, while in theory there could simply be an endpoint for the ENS app and other apps to communicate with.