Description
@todo
- DEP .wellknown Proposal
- consensus might be CRDT based
- tower idea
- retrieve hypercore backed data onchain
- command cores
- service markets
- put chain source and blockchain into hypercores
- erasure codes for more efficient hypercore duplication?
- "IBC" protocol standard
- check ResNetLab
- https://github.com/protocol/ResNetLab/tree/master/3DM_RFC
- https://github.com/protocol/ResNetLab/tree/master/OPEN_PROBLEMS
- check https://github.com/protocol/ResNetLab#research
- check https://github.com/protocol/ResNetLab#projects
- check https://github.com/protocol/ResNetLab#rfps
- check https://github.com/protocol/ResNetLab#collaborations
- check https://github.com/protocol/ResNetLab#publications-talks--trainings
DEP .wellknown
Proposal
declare "wants" for pinning in .wellknown
- it includes
extrinsics signed with authors dat secret key
- or a supporter registers for them
- could be used by many pinning/hosting service, not only datdot
- anyone can collect those hypercores and PUBLISH them (e.g. maybe integrated into BEAKER BROWSER?)
- PUBLISHING them adds them to the datdot system, but they aren't necessarily
command hypercores
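Roughly, such a .wellknown "wants" declaration could look like the sketch below. This is only an assumption-level sketch - the field names (wants, ranges, extrinsic, supporter) are made up here, not a finalized format:

// hypothetical shape of a .wellknown/datdot "wants" file - illustrative only
const wants = {
  wants: [
    {
      hypercore: 'dat://<hypercore address>',   // which hypercore should be pinned
      ranges: [[0, 100]],                       // which chunk ranges
      extrinsic: {                              // chain call the pinner/publisher may relay
        call: 'registerHypercore',
        args: ['<hypercore address>']
      },
      signature: '<signed with the author\'s dat secret key>'
    }
  ],
  // alternatively a supporter registers on the author's behalf
  supporter: { publickey: '<supporter pkey>', signature: '<supporter signature>' }
}
module.exports = wants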
CRDT consensus mechanism
so - currently there's a bit of logic that runs when "importing" a block from the network, and it controls how blocks change storage etc. In the "blockchain world" that means you keep storage for all prospective forks of the chain until finality (or similar) and then prune them, but in the CRDT world the logic is slightly different.
the idea is we could replace the "blockchainy" logic in block importing with CRDT logic
this would mean there are no forks; instead you could have divergent changes, and one set of changes potentially gets "forgotten" - that is still useful for certain use cases => it would be a new consensus/block production mechanism (see the sketch after the links below)
substrate has pluggable consensus, I already migrated from Aura to Babe, this would be a from-scratch mechanism though
see: https://github.com/ipfs/notes/tree/master/CRDT
and: https://github.com/ipfs/notes/blob/master/CRDT/json-crdt.md
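To make the contrast concrete, here is a minimal sketch (plain JS, purely illustrative - not the actual substrate/datdot import logic) of CRDT-style import: a last-writer-wins merge where divergent writes converge deterministically instead of living on as competing forks:

// last-writer-wins (LWW) register merge - assumption-level sketch
function mergeLWW (a, b) {
  // each entry: { value, timestamp, author }
  if (a.timestamp !== b.timestamp) return a.timestamp > b.timestamp ? a : b
  return a.author > b.author ? a : b // deterministic tie-break so every node converges
}

// "importing" a chunk then just merges its writes into local state,
// instead of keeping storage for all prospective forks until finality
function importChunk (state, chunk) {
  for (const [key, entry] of Object.entries(chunk.writes)) {
    state[key] = state[key] ? mergeLWW(state[key], entry) : entry
  }
  return state
}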
TOWER IDEA
- everyone has their own currency => hypercores as "blockchains" with single authorities
- publish extrinsics/transactions/commands to hypercores
@IDEA:
- anyone can follow a HYPERSWARM (= hypercore with somebody as the owner) - and RELAY that owner's transactions onto their own TOWER (= transaction hypercore), as sketched below
- don't trust payments in someone's currency, unless
  - it is verified by them
  - you trust them
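A rough sketch of the relay loop, reusing the hypothetical vaultAPI/serviceAPI interfaces from the examples further down (createHypercore and isValidTransaction are invented names):

// follow somebody's hypercore and RELAY their transactions onto my own TOWER
const serviceAPI = require('datdot-service')
const vaultAPI = require('datdot-vault')

const { account: { publickey, secretkey }, sign } = vaultAPI.account()
const tower = serviceAPI.createHypercore(secretkey) // my own transaction hypercore

function relay (owner_address) {
  serviceAPI.followHypercore(owner_address, function onchunk (chunk) {
    // only relay chunks that parse as transactions and verify against the owner's key
    if (!serviceAPI.isValidTransaction(chunk, owner_address)) return
    tower.append({ relayed_from: owner_address, chunk, signature: sign(secretkey, chunk) })
  })
}

relay('<owner hypercore address>')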
data retrieval
USE CASE: retrieve data from a known hypercore onchain (e.g. to be used in a smart contract pallet)
// retrieve hypercore backed data onchain
const vaultAPI = require('datdot-vault')
const serviceAPI = require('datdot-service')
const chainAPI = require('datdot-substrate')
// ------------
// SUPPORTER
// the supporter signs with their own account; `nonce`, `hypercore_address1/2`
// and `range1/2` are placeholders for real values
const { account: { publickey: supporter_pkey, secretkey: supporter_skey }, sign: supporter_sign } = vaultAPI.account()
const request_signature = supporter_sign(supporter_skey, nonce, hypercore_address1)
chainAPI.requestData(supporter_pkey, request_signature, [[hypercore_address1, range1], [hypercore_address2, range2]], event => {})
// ------------
// RETRIEVER
const { account: { publickey, secretkey }, sign } = vaultAPI.account()
chainAPI.offerRetrieval(publickey, function onRetrieval (event) {
  const { addressranges, author: { publickey: author_pkey } } = event
  addressranges.forEach(({ address, range }) => {
    serviceAPI.followHypercore(address, range, function onchunk (chunk) {
      const signature = sign(secretkey, nonce, chunk)
      chainAPI.submitData(publickey, signature, chunk) // + Merkle Proof
      // RETRIEVER needs to pay `transaction fee`
      // => RETRIEVERS get paid for doing it (incentivisation)
      // => BACKERS pay for it
      // => FRIENDS are RETRIEVERS who pay for it
      // => or COMPANY (thus not for free)
    })
  })
})
// => datdot has fixed changing set of RETRIEVERS
RANDOM RETRIEVER vs. CHOSEN RETRIEVER
=> RANDOM RETRIEVER gets paid by CHAIN and CHAIN gets paid by BACKERS
=> CHOSEN RETRIEVER does it for free or gets paid by BACKERS directly
=> like e.g. offering a service to BACKERS and then choosing oneself to do it and get paid
command cores
SUBMITTING EXTRINSICS (for free) via HYPERCORES - upload info (dats)
USE CASE:
- User X doesn't want to interact with or run a datdot node
- User X is PUBLISHER of a dat/hypercore that is being pinned on datdot
- User X is AUTHOR and can append an extrinsic to the hypercore which is parsed and executed onchain.
- => X does not use datdot but publishes a dat/hypercore which is hosted/pinned by datdot
- => X can append a valid extrinsic to the hypercore, which is then parsed & executed on chain
- => X does not need to touch other rpc endpoints or run a datdot node
- => X can do stuff like:
  registerFriends()
  registerHypercore()
  submitProofsForChallenge()
  transferRatio()
  ...but without having to use anything but **the swarm** and **their pinners**
- not SUBSTRATE reacting to DATS
- but DATS send information back to SUBSTRATE
- => X has a dat folder as OUTBOX for commands connected to a datdot instance (see the sketch below)
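The AUTHOR side of this flow isn't shown in the RELAYER example further down, so here is a rough sketch of appending a command to the OUTBOX hypercore. Everything here (vaultAPI, serviceAPI.createHypercore, the method/extrinsic envelope) is an assumption mirroring the surrounding examples, not a finished interface:

// AUTHOR (User X): append a signed extrinsic to their own OUTBOX hypercore
const vaultAPI = require('datdot-vault')
const serviceAPI = require('datdot-service')

const { account: { publickey, secretkey }, sign } = vaultAPI.account()
const outbox = serviceAPI.createHypercore(secretkey) // X's command hypercore

function appendCommand (call, args) {
  const data = { call, args }             // e.g. { call: 'registerHypercore', args: [...] }
  const signature = sign(secretkey, data) // signed with the author's key
  outbox.append({ method: 'author', extrinsic: { signature, data } })
}

appendCommand('registerHypercore', ['<hypercore address>'])
// RELAYERS watching this hypercore pick the chunk up and submit it on chain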
ALTERNATIVE
PEOPLE would just subscribe to CHAIN EVENTS
- => but PEOPLE are interested in ALL the HYPERCORE CHUNKS if they are VALID EXTRINSICS
RELAYERS - way simpler than people subscribing to hypercores and registering callbacks
- HOSTERS (& maybe ATTESTERS) submit chunks to verify on CHAIN
- BACKERS pay (in part for EXECUTION of VALID CHUNK EXTRINSICS)
- AUTHOR can submit VALID EXTRINSICS (e.g. any call to runtime functions)
USE CASE: especially cool if smart contracts exist
submit extrinsics for free via hypercores
0. a PUBLISHER publishes a command hypercore
- a RELAYER can monitor and submit found valid extrinsic chunks
- a RELAYER gets paid from BACKERS
- if CHAIN figures out a hypercore chunk is valid
  - has a proof
  - is a valid extrinsic
- then CHAIN accepts it as an extrinsic signed by the author's pubkey
- so merkle roots don't really matter in this case
- and CHAIN executes it onchain for free???
  - who pays?
so the datdot chain would be a marketplace where people can request relaying,
and others could discover and then take on that job,
and every time a valid extrinsic properly signed by the author is submitted,
they get paid and the author or any supporter gets charged
const vaultAPI = require('datdot-vault')
const serviceAPI = require('datdot-service')
const chainAPI = require('datdot-substrate')
// ------------
// SUPPORTER
// the supporter signs with their own account; `nonce` and `hypercore_address`
// are placeholders for real values
const { account: { publickey: supporter_pkey, secretkey: supporter_skey }, sign: supporter_sign } = vaultAPI.account()
const request_signature = supporter_sign(supporter_skey, nonce, hypercore_address)
chainAPI.requestRelay(supporter_pkey, request_signature, hypercore_address)
// ------------
// RELAYER
const { account: { publickey, secretkey }, sign } = vaultAPI.account()
chainAPI.offerRelay(publickey, function onRelay (event) {
  const { address, author: { publickey: author_pkey } } = event
  serviceAPI.followHypercore(address, function onchunk (chunk) {
    if (chainAPI.isValidExtrinsic(chunk)) {
      const { extrinsic, method } = chunk
      if (method === 'wrap') {
        const signature = sign(secretkey, nonce, extrinsic)
        chainAPI.submit(publickey, signature, extrinsic) // as AGENT
        // RELAYER needs to pay `transaction fee`
        // => RELAYERS get paid for doing it (incentivisation)
        // => BACKERS pay for it
        // => FRIENDS are RELAYERS who pay for it
        // => or COMPANY (thus not for free)
      } else if (method === 'author') {
        const { signature, data } = extrinsic
        chainAPI.submit(author_pkey, signature, data) // as AUTHOR
        // AUTHOR needs to pay `transaction fee`
        // => RELAYERS get re-imbursed off-chain by AUTHOR or BACKER
      } else if (method === 'RELAYER') {
        const { data } = extrinsic
        const new_signature = sign(secretkey, nonce, data) // RELAYER re-signs the data
        chainAPI.submit(publickey, new_signature, data) // as RELAYER
      }
    }
  })
})
// => datdot has fixed changing set of RELAYERS
// => chain needs to verify the extrinsic based on the public key, because merkle roots don't help here
RANDOM RELAYER vs. CHOSEN RELAYER
=> RANDOM RELAYER gets paid by CHAIN and CHAIN gets paid by BACKERS
=> CHOSEN RELAYER does it for free or gets paid by BACKERS directly
=> like e.g. offering a service to BACKERS and then choosing oneself to do it and get paid
additional stuff:
async function command (ledger, message, done) {
  const { ID, data: { chunkIndex, chunkHash } } = message
  const feed_key = await ledger.get(`/accounts/${ID}/pkey`)
  const length = await ledger.get(`/accounts/${ID}/length`)
  const digest = await ledger.get(`/accounts/${ID}/digest`)
  const merkle_root = { digest, length }
  // pick a few online providers to fetch and execute the chunk
  const random_provider_IDs = getRandomOnlineProvider(3)
  emit('event', {
    type: 'fetch_and_execute',
    random_provider_IDs,
    chunkIndex,
    feed_key,
    merkle_root
  })
}
async function submit (ledger, message, done) {
}
async function back (ledger, message, done) {
  const { ID, data: { chunkIndex, chunkHash } } = message
  const feed_key = await ledger.get(`/accounts/${ID}/pkey`)
  const length = await ledger.get(`/accounts/${ID}/length`)
  const digest = await ledger.get(`/accounts/${ID}/digest`)
  const merkle_root = { digest, length }
  // ....
}
service market
Every market service makes a separate service economy
- e.g. hosting service market
- e.g. command RELAYER market
Because it is hard to compare "apples" to "oranges"
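For illustration only: each market could expose its own request/offer pair priced in its own unit, so prices never need to be compared across markets. requestHosting and the price parameters are invented names here, not an existing datdot API:

const chainAPI = require('datdot-substrate')
// hosting service market: priced e.g. per stored GB per month
chainAPI.requestHosting(publickey, signature, hypercore_address, { maxPricePerGbMonth: 10 })
// command RELAYER market: priced e.g. per relayed extrinsic
chainAPI.requestRelay(publickey, signature, hypercore_address, { maxPricePerExtrinsic: 1 })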
put chain source and blockchain into hypercores
- datdot allows BACKERS and SEEDERS to keep the chain source and blockchain hosted.
- anyone can seed the chain source and blockchain themselves if they want to trust themselves
- adding blocks to the chain might then need mechanisms like:
  - BLS threshold signature?
  - shamir's secret?
  - ..or something to make the block signing of validators a multisig thing?
  - https://github.com/poanetwork/threshold_crypto
- so if some technique can be used so validators can author new blocks (= chunks in that case)
- so if the entire code of the chain can be stored in a hyperdrive (= version controlled)
- then a change to the code via Pull Request could be made
- or a fork of the chain's source code hyperdrive
- ...but that itself is all hypercores, so the chain source code hypercores can be pinned by the chain
- all this could be client code triggered by watching changes to hyperdrives or hypercores :-)
- approach (sketched below)
  - we can have it mirrored in multiple hypercores, or each block producer could put blocks they produce in a hypercore signed by the same key they use for block production - we know who they are (the validator set), so we can join swarms for them. everything else is already part of the chain, unless you are saying we should replace the chain db with hypercores entirely 😅
  - so the runtime code is already stored onchain - that would be included in storing/distributing blocks as part of hypercores, but storing source in a hypercore as well would be really cool
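A rough sketch of the "each block producer puts their blocks into a hypercore signed with their block-production key" approach. chainAPI.onNewBlock, chainAPI.validatorSet and serviceAPI.createHypercore are assumed interfaces for illustration, and validator_secretkey/validator_publickey are placeholders:

const serviceAPI = require('datdot-service')
const chainAPI = require('datdot-substrate')

// VALIDATOR side: mirror every authored block into a hypercore signed with
// the same key used for block production
const block_core = serviceAPI.createHypercore(validator_secretkey)
chainAPI.onNewBlock(function (block) {
  if (block.author === validator_publickey) block_core.append(block)
})

// EVERYONE ELSE: the validator set is known on chain, so peers can follow
// each validator's block hypercore and thereby seed the blockchain itself
chainAPI.validatorSet().forEach(function (validator_pkey) {
  serviceAPI.followHypercore(validator_pkey, function onblock (block) {
    // verify the block the same way the chain would before storing/seeding it
  })
})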
erasure codes for more efficient hypercore duplication?
modules
- https://github.com/ronomon/reed-solomon && https://github.com/scality/ecstream
- https://github.com/IagoLast/jqr-reed-solomon
- https://github.com/akalin/intro-erasure-codes
- https://github.com/richardkiss/js_zfec
- https://github.com/ianopolous/ErasureCodes
- https://github.com/rayje/ECvR
- https://github.com/LayrFS/Layr
- https://github.com/erasureprotocol/erasure-protocol/tree/master/packages/crypto-ipfs
- https://github.com/scality/eclib#readme
theory
- https://news.ycombinator.com/item?id=19247633
- https://news.ycombinator.com/item?id=19249405
- https://arxiv.org/abs/1809.09044
- https://arxiv.org/pdf/1307.6930.pdf
- https://en.wikipedia.org/wiki/Tornado_code
- https://en.wikipedia.org/wiki/Reed%E2%80%93Solomon_error_correction
- https://en.wikipedia.org/wiki/Fountain_code
- https://en.wikipedia.org/wiki/Raptor_code
- maybe some inspiration can come from https://github.com/ethereum/research/wiki/A-note-on-data-availability-and-erasure-coding
- https://docs.min.io/docs/minio-erasure-code-quickstart-guide.html
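To give a flavour of the idea without depending on any of the modules above, here is a minimal single-parity sketch: split a chunk into k data shards plus one XOR parity shard, so any one missing shard can be rebuilt. Real schemes (Reed-Solomon, fountain codes) tolerate more losses, but the storage trade-off has the same shape:

// minimal XOR-parity sketch (like RAID-5): k data shards + 1 parity shard
function encode (chunk, k) {
  const size = Math.ceil(chunk.length / k)
  const shards = []
  for (let i = 0; i < k; i++) {
    const shard = Buffer.alloc(size)
    chunk.copy(shard, 0, i * size, (i + 1) * size)
    shards.push(shard)
  }
  const parity = Buffer.alloc(size)
  for (const shard of shards) for (let j = 0; j < size; j++) parity[j] ^= shard[j]
  return { shards, parity }
}

function recover (shards, parity, missingIndex) {
  const rebuilt = Buffer.from(parity) // start from parity, XOR out the surviving shards
  shards.forEach(function (shard, i) {
    if (i === missingIndex) return
    for (let j = 0; j < rebuilt.length; j++) rebuilt[j] ^= shard[j]
  })
  return rebuilt
}

// usage: 4 data shards + 1 parity shard => 5 hosters, any 4 of them suffice
const { shards, parity } = encode(Buffer.from('some hypercore chunk data...'), 4)
const shard2 = recover(shards, parity, 2) // rebuild shard 2 without contacting its hoster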
"IBC" protocol standard
add a smart-contract-like exchange and/or cross-chain connection standard, so other chains which implement it (e.g. other chains which use datdot, e.g. as a substrate pallet) can connect and, at the bare minimum, exchange ratio across chains, and eventually take care of exchange rates and other things, so there is no need to go through a centralised service.
- also support if chain B wants to send ratio from chain C to chain A (see the sketch below)
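A hedged sketch of what such a cross-chain call could look like on the hypothetical chainAPI - connectChain, lockRatio and creditRatio are invented here purely to illustrate the standard described above:

const chainAPI = require('datdot-substrate')

// chain B (us) connects to two other chains that implement the standard
const chainA = chainAPI.connectChain('chain-A')
const chainC = chainAPI.connectChain('chain-C')

// send ratio held on chain C over to chain A: lock/burn it on C, then credit
// it on A with a proof of that lock, so no centralised service is involved
chainC.lockRatio(account, amount, function onProof (proof) {
  chainA.creditRatio(account, amount, proof)
})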