The Ocean Protocol might provide usable infrastructure for us. At the very least, it'll be a source of inspiration around data provenance and curation.
Snippets from the Ocean Academy:
Ocean Protocol defines how communication between data providers and data consumers takes place in a decentralized way, either directly or via apps like marketplaces. It provides a complete toolbox for data scientists and developers to securely publish, exchange, and use data.
It allows data providers to securely monetize their private data in full transparency while keeping full ownership.
It allows data consumers to run compute algorithms on vetted data that was previously inaccessible (the data was not shared, there was no open marketplace to find it, and no process to judge its quality or relevance).
Ocean Market, the reference marketplace provided by Ocean Protocol [...] makes it easy to publish data services, provide accurate pricing for the data, discover data, purchase data, and consume data services.
Ocean Market supports fixed pricing and automatic price discovery for your assets (datasets and algorithms).
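To make "automatic price discovery" concrete: Ocean Market has historically used liquidity pools (Balancer-style) of OCEAN against an asset's datatoken, so the price emerges from trading rather than being set by hand. The toy below is a minimal constant-product AMM sketch of that idea, not Ocean's actual weighted-pool math; the class and reserve numbers are illustrative assumptions.

```python
# Toy constant-product AMM (x * y = k) illustrating automatic price
# discovery. Real Ocean pools used Balancer weighted math and fees;
# all names and numbers here are illustrative.
from dataclasses import dataclass


@dataclass
class Pool:
    ocean_reserve: float      # OCEAN tokens held by the pool
    datatoken_reserve: float  # datatokens held by the pool

    def spot_price(self) -> float:
        """Current price of one datatoken, denominated in OCEAN."""
        return self.ocean_reserve / self.datatoken_reserve

    def buy_datatokens(self, ocean_in: float) -> float:
        """Swap OCEAN in for datatokens out, keeping x * y = k constant.

        The price rises as datatokens leave the pool - that shift is
        the 'discovery' mechanism.
        """
        k = self.ocean_reserve * self.datatoken_reserve
        new_ocean = self.ocean_reserve + ocean_in
        out = self.datatoken_reserve - k / new_ocean
        self.ocean_reserve = new_ocean
        self.datatoken_reserve -= out
        return out


pool = Pool(ocean_reserve=1000.0, datatoken_reserve=100.0)
print(pool.spot_price())            # 10.0 OCEAN per datatoken
bought = pool.buy_datatokens(100.0)
print(round(pool.spot_price(), 2))  # higher than 10.0 after the buy
```

The point for us: nobody quotes a price; buying pressure on a dataset's datatoken raises its price automatically, which doubles as a crowd-sourced curation signal.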
Algorithm Providers are called Compute Providers in Ocean. They sell algorithms instead of the data itself. By making their algorithms available, Compute Providers are paid every time one of their algorithms is used.
Algorithm Providers share their ML scripts with the market. Data providers approve AI algorithms to run on their data and then Compute-to-Data orchestrates remote computation and execution on data to train AI models while preserving the privacy of the data. Smart contracts ensure that every data provider/AI practitioner can verify proper execution of their algorithm. In chapter 20 there is more about Compute-to-Data.
Ocean's Compute-to-Data is the functionality that resolves the current tradeoff between the benefits of using private data and the risks of exposing it. It allows data consumers to run compute jobs on data to train AI models while the data stays on-premise with the data provider.
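The Compute-to-Data flow described in the two snippets above can be sketched as follows. Every class and function name here is a hypothetical illustration, not the ocean.py API: the essential point is that the provider whitelists algorithms, the orchestrator runs the approved algorithm where the data lives, and only the result (never the raw records) leaves the provider's premises.

```python
# Illustrative sketch of Compute-to-Data: algorithm travels to the data,
# raw data never travels to the consumer. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Dataset:
    name: str
    records: list                                   # stays on-premise
    approved_algos: set = field(default_factory=set)

    def approve(self, algo_id: str) -> None:
        """Data provider vets and whitelists an algorithm by id."""
        self.approved_algos.add(algo_id)


def compute_job(dataset: Dataset, algo_id: str, algo):
    """Orchestrator: executes an approved algorithm next to the data
    and returns only the computed result."""
    if algo_id not in dataset.approved_algos:
        raise PermissionError(f"{algo_id} not approved for {dataset.name}")
    return algo(dataset.records)


ds = Dataset("hospital-readmissions", records=[3, 1, 4, 1, 5])
ds.approve("mean-v1")
result = compute_job(ds, "mean-v1", lambda rows: sum(rows) / len(rows))
print(result)  # only the aggregate leaves; the raw records never do
```

In the real protocol, smart contracts and the compute orchestration stack play the roles of `approve` and `compute_job` here, which is what lets both sides verify that only the vetted algorithm ran.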