Additionally, how much control does the user ultimately have over where their data is actually stored? If a user is particular about their data living in a specific storage backend (S3, Azure, etc.), do they necessarily have to run their own hub, or is it envisaged that hub operators could emerge who would provide guarantees to the user about the backend storage?
A Storage Provider is a service provider or company that runs and maintains a storage hub.
I hope to see a storage provider that combines the Gaia protocol with CalDAV/CardDAV, or even more services like those provided by ownCloud. There could be storage providers that have better SLAs than others, more free storage, etc.
I don’t think this was intentional in the screenshot you shared.
Some background: we initially had some software running on users’ computers that served as a translation layer to connect to storage providers such as Dropbox.
We ran into some problems with that architecture - specifically, the consumer-grade storage providers we were supporting didn’t really want their products being used as “dumb drives” for large amounts of opaque, encrypted data publicly accessible on the internet. Making things work within their constraints proved overly complex.
Because of this, we switched to the current Gaia Hub model, where software that we created runs on top of wholesale, commodity cloud storage and provides a simple, consistent public-key/address-based access architecture. We usually refer to these as “storage hubs”. A user can run their own hub or use a public one.
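To make the “public-key/address-based access” idea a bit more concrete, here is a minimal sketch (not the actual Gaia hub code) of how a hub might namespace each user’s files under an identifier derived from their public key, so the backing bucket only needs dumb “put object at key” semantics. The hash160-style digest and path layout below are simplifications; the real hub uses Blockstack’s address encoding and verifies a signed authentication token before accepting writes.

```typescript
import { createHash } from "crypto";

// Illustrative only: derive a short address-like identifier from a user's
// public key (sha256 then ripemd160, without the real base58check encoding).
function addressForPublicKey(publicKeyHex: string): string {
  const sha = createHash("sha256").update(Buffer.from(publicKeyHex, "hex")).digest();
  const ripemd = createHash("ripemd160").update(sha).digest();
  return ripemd.toString("hex");
}

// Every file a user writes lives under a prefix derived from their key,
// so different users' files never collide in the shared backend bucket.
function storagePathFor(publicKeyHex: string, filename: string): string {
  return `${addressForPublicKey(publicKeyHex)}/${filename}`;
}

// Example: two users writing "profile.json" end up at distinct paths.
const alicePath = storagePathFor("02" + "11".repeat(32), "profile.json");
const bobPath = storagePathFor("03" + "22".repeat(32), "profile.json");
console.log(alicePath);
console.log(bobPath);
```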
Storage hub is another term for Gaia hub in this case. The hub can be run on a storage provider, where “provider” means a third-party company (and the infrastructure it hosts) that the Gaia hub runs on top of, such as AWS, Google Cloud, Azure, or DigitalOcean. The Gaia hub itself refers to the storage service, which can be set up locally or run in the cloud on a provider of your choosing.
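As a rough illustration of the “provider of your choosing” point, a self-hosted hub is typically pointed at a backend through its configuration. The sketch below is not the Gaia hub’s actual config schema; the field names (driver, bucket, port, etc.) are assumptions chosen to show that the same hub software can sit in front of local disk or a cloud bucket. Check the hub’s own documentation for the real options.

```typescript
// Hypothetical configuration shape for a self-hosted storage hub.
// Field names are illustrative, not the real Gaia hub schema.
type HubBackend =
  | { driver: "disk"; storageRootDirectory: string }           // run locally
  | { driver: "aws"; bucket: string; region: string }          // S3-backed
  | { driver: "azure"; bucket: string; accountName: string };  // Azure-backed

interface HubConfig {
  port: number;        // where the hub listens for read/write requests
  backend: HubBackend; // which storage provider actually holds the bytes
}

// A user who cares about where their bytes live can self-host with a
// config like this, or pick a hub operator that advertises its backend:
const selfHosted: HubConfig = {
  port: 3000,
  backend: { driver: "disk", storageRootDirectory: "/var/gaia" },
};

const s3Backed: HubConfig = {
  port: 3000,
  backend: { driver: "aws", bucket: "my-gaia-bucket", region: "us-east-1" },
};

console.log(selfHosted, s3Backed);
```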