Thursday, November 30, 2017
An n by n grid may be randomly populated with symbols, and this grid may be considered to constitute a private key K. By reading the rows and columns of the grid, one can derive a list of 2n words, each n symbols long, which is published as the public key P (after first sorting alphabetically so as to remove compromising information carried in the ordering). Let S be the signature space, where each s ∈ S is a selection of n/2 columns and n/2 rows of K. Let F be a mapping from the message space M to S. The signer creates a signature for m ∈ M by computing s = F(m), and the verifier confirms that each word in s appears in P, and that the selected rows and columns satisfy the constraints of a valid crossword puzzle.
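As a toy illustration of the scheme just described, here is a minimal Python sketch. All function names and parameter choices are my own, and the mapping F is an arbitrary hash-based stand-in - this is a sketch of the mechanics, not a vetted implementation:

```python
import hashlib
import random

def keygen(n, symbols="01", seed=None):
    """Toy key generation: an n-by-n grid of random symbols is the
    private key K; the sorted list of its 2n row/column words is the
    public key P (sorting hides which words were rows vs. columns)."""
    rng = random.Random(seed)
    grid = [[rng.choice(symbols) for _ in range(n)] for _ in range(n)]
    rows = ["".join(r) for r in grid]
    cols = ["".join(grid[i][j] for i in range(n)) for j in range(n)]
    return grid, sorted(rows + cols)

def F(message, n):
    """Map a message to a selection of n/2 row indices and n/2 column
    indices, derived deterministically from a hash of the message."""
    digest = hashlib.sha256(message.encode()).digest()
    rng = random.Random(digest)
    return (sorted(rng.sample(range(n), n // 2)),
            sorted(rng.sample(range(n), n // 2)))

def sign(message, grid):
    """Reveal the words at the row/column indices selected by F."""
    n = len(grid)
    row_idx, col_idx = F(message, n)
    row_words = [(i, "".join(grid[i])) for i in row_idx]
    col_words = [(j, "".join(grid[i][j] for i in range(n))) for j in col_idx]
    return row_words, col_words

def verify(message, sig, pub, n):
    row_words, col_words = sig
    # the revealed indices must match F(m)
    if (sorted(i for i, _ in row_words),
            sorted(j for j, _ in col_words)) != F(message, n):
        return False
    # every revealed word must appear in the public key ...
    if any(w not in pub for _, w in row_words + col_words):
        return False
    # ... and rows/columns must agree at every intersection,
    # i.e. satisfy the crossword constraint
    return all(rw[j] == cw[i] for i, rw in row_words for j, cw in col_words)
```

With n = 8 and a binary symbol space, signing reveals four rows and four columns; the verifier checks membership in P and the crossword consistency of the intersections.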
One can imagine using a larger number of dimensions, variations using a different number of symbols, using rectangular as opposed to square grids, or using combinations of different sizes. But binary symbol spaces, a dimensionality of two, and combination size n/2 seem like reasonable choices. Mathematical validation of these parameters would be worthwhile. A significant advantage of this signature scheme is that crossword generation is known to be NP-complete.
(One would have to be very careful but, just possibly, bloom filters could be used to compress the public key at the expense of signature size. The false positive rate would have to be assessed carefully, as would the assumption that the false positives are distributed in ways inconvenient to adversaries.)
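To make the bloom filter idea concrete, here is a minimal sketch of a filter over the public words. The parameters are arbitrary illustrations, and the caution above about false positive rates applies in full:

```python
import hashlib

class Bloom:
    """Minimal Bloom filter sketch. A real scheme would need careful
    analysis of the false positive rate, and of whether false positives
    land in places convenient to an adversary, as cautioned above."""

    def __init__(self, m_bits, k_hashes):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray((m_bits + 7) // 8)

    def _positions(self, item):
        # derive k bit positions from salted SHA-256 digests
        for i in range(self.k):
            h = hashlib.sha256(item.encode() + bytes([i])).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] >> (p % 8) & 1
                   for p in self._positions(item))
```

The public key would then be the filter's bit array rather than the word list itself, and the verifier would test each revealed signature word for membership.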
(Please notice that these signatures intentionally reveal information about the private key. The scheme is therefore best suited to single-use applications; to applications where you can pregenerate a large number of keys; to applications where you can sign a new public key each time you also sign a message; or to applications where the decrease in security that accompanies each signing is somehow useful to the incentives you wish to arrange for the signer.)
Purchasing cryptocurrency is like purchasing land in, say, Silicon Valley, which may later be leased or sold to entrepreneurs who would like to build their businesses there. The ability of that land to facilitate the subsequent business activity it hosts is the source of the property's underlying value.
Come to think of it, USD is also just a protocol, and its value, like that of land, follows from its ability to facilitate the subsequent business activity it hosts.
Thanks to Willy Woo for the analogy to real estate.
Tuesday, November 28, 2017
In this manner, a competitive market could be created for the automatic upgrading of a blockchain's cryptographic primitives over time - at least as utilized in transaction scripts. But if not restricted to cryptographic primitives, perhaps this mechanism could also facilitate the on-chain standardization and improvement of reusable smart contracts.
Monday, November 27, 2017
But there is an unexpected benefit to being second. I now enjoy the cheap thrill of seeing the idea put into practice without having yet done any work to promulgate the technique:
I am grateful to Adrian for telling me about this application, and for pointing me towards several related concepts. As TESLA is already seeing use, it is with greater confidence that I can submit it for consideration in other contexts. In the Galileo application, the performance advantages dominate - but I am particularly interested in the quantum resistance advantages of cryptosystems based on swappable one-way functions.
DCKR / DHKR writeup:
Sunday, November 26, 2017
Tuesday, November 21, 2017
Sunday, November 19, 2017
Friday, November 17, 2017
Why this matters: I don't care how patient you are, SHA-256 is just not something you'll be doing with pencil and paper anytime soon. There are checksums you CAN compute with pencil and paper, but SHA-256 is not one of them.
A common dilemma for hardware wallet users who don't trust general-purpose hardware is how to back up their private seed. One common approach is pencil and paper, but then you need to make many copies to prevent loss (e.g. if your house burns down), and you need to find physically secure places for those various backups. Your ability to recover will be contingent on your continued ability to physically access one or more of those backups. But there is a massive amount of evergreen data on the internet, and some of it is even usefully random (e.g. transaction IDs in the bitcoin blockchain). One can mix data in arbitrary ways via one-time-pad-like operations using pencil and paper techniques. In other words, your backup could effectively be available to you online. If you are doing something even slightly secure, then a lot of pencil and paper work would be necessary anytime you need to recover, but recovery is a rare event. Assurance, more than convenience, is what matters for such a backup. By accessing large amounts of random data, most of it irrelevant to your operation, you can even defend against surveillance more easily than you could if you had to somehow write specific predetermined data to places that will be backed up and widely available to very high assurance for a very long time.
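As a minimal sketch of the mixing idea, assuming hex-encoded material and a nibble-wise XOR (one of many one-time-pad-like operations that can be carried out with pencil and paper, or checked with a computer you trust):

```python
def xor_hex(a, b):
    """Combine two equal-length hex strings nibble by nibble with XOR -
    a hex one-time pad. Each nibble operation is small enough to do by
    hand with a lookup table."""
    assert len(a) == len(b)
    return "".join(format(int(x, 16) ^ int(y, 16), "x") for x, y in zip(a, b))
```

For example, XORing your seed against a public transaction ID yields a pad; later, XORing that pad against the same (evergreen, publicly retrievable) transaction ID recovers the seed. Which public data you used, and in what combination, is the part you keep secret.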
You can almost, but not quite, do that today with your typical BIP 32 hardware wallet. All that would be needed would be to choose a checksum designed for use with pencil and paper. Or even better - an optional hex bypass during recovery that does not use a checksum.
Thursday, November 16, 2017
The recent Ethereum hack involving a smart contract bug illustrates a type of vulnerability that we'll be seeing a lot more often.
One can think of smart contracting as a form of programmable protocol, where protocols are themselves customized for specific uses in much the way that programming languages have traditionally customized local computer behavior for specific uses. Although protocol vulnerabilities have certainly existed before, our security, testing, and trust models today are optimized for protocols that are 'hard coded' by some standards organization prior to widespread deployment. Software defined radio, smart contracting, and likely other future developments will require that we start thinking about protocol vulnerabilities more like we think about software vulnerabilities today. While clever protocols increasingly distribute trust and mitigate damage from compromise of individual machines, the protocols themselves will become the target of choice.
One way to help is to design protocol definition languages with useful provability features. We need more research along these lines:
Wednesday, November 15, 2017
Here is a piece I wrote some years ago, although I didn't type it up until more recently:
If you spin the wheel to one "secret" position and leave it there, then you can reproduce the same cipher that Julius Caesar purportedly used to protect his military secrets. This cipher is also the same cipher used by "secret decoder ring" toys. It is sometimes known as a shift cipher, or a Caesar cipher. It is very easy to break. Some children might enjoy trying to break it.
On the other hand, if the key is securely random, can include any symbol on the disk with equal probability, is as long as the message (so that every letter of plaintext is encoded under a different random setting), and is used only once, then you are using a cipher known as the one-time pad. The one-time pad is, I believe, the only provably secure cipher so far discovered.
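Both ciphers can be sketched in a few lines. Here is an illustrative Python version over a 27-symbol disk (26 letters plus a space symbol, written "SP" on the paper templates); the alphabet and function names are my own choices:

```python
import string

# 27-symbol disk: A-Z plus space ("SP" on the paper template)
ALPHABET = string.ascii_uppercase + " "

def shift_encrypt(plaintext, key):
    """Caesar / shift cipher: the wheel is spun to one secret
    position (a single integer key) and left there."""
    n = len(ALPHABET)
    return "".join(ALPHABET[(ALPHABET.index(c) + key) % n] for c in plaintext)

def otp_encrypt(plaintext, key):
    """One-time pad: a fresh, uniformly random shift for every
    letter, with the key used only once."""
    n = len(ALPHABET)
    return "".join(ALPHABET[(ALPHABET.index(c) + k) % n]
                   for c, k in zip(plaintext, key))

def otp_decrypt(ciphertext, key):
    n = len(ALPHABET)
    return "".join(ALPHABET[(ALPHABET.index(c) - k) % n]
                   for c, k in zip(ciphertext, key))
```

A one-time pad key here is just a list of independent uniform rolls in range(27), as long as the message, never reused.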
This template might be fun for use with children, since it can work on English messages where capitalization is not important. "SP" stands for space.
This template is useful for working with hexadecimal data. If you find yourself working in hex with paper instead of a computer, then let's hope you have reason for such a high paranoia setting. (Perhaps... cryptocurrency key derivation from public persistent data?)
And here's another template useful for working with passwords:
(If anyone wants SVG source, or python for generating the SVG source, let me know.)
I think the future shape and role of cryptocurrency will take some decades to emerge. I think the world will eventually settle on a small handful of cryptocurrencies, with one having clear dominance over the others. I give this scenario about 60% odds, but I have little idea how long it will take to manifest. I think it will emerge first in politically unstable or fiscally undisciplined countries, and in countries that today run fractional reserve on a fiat not under their control. It will next become the global standard for international trade, whereupon the US will suffer for having lost its unique position as controller of international trade money. But other countries will not mind the loss of America's privileged position, and this improved fairness will partially motivate the switch. Once that transition becomes evident, the US will suddenly regret its debt. We might suddenly and dramatically have to inflate our way out of that debt, which will initiate the switch of the US, and then the other stable countries, to finally adopt the dominant cryptocurrency. If the US Fed is wise, it might attempt to delay that switch by issuing its own central bank cryptocurrency - one very specifically NOT offered as payment for legacy US debt, and whose value would therefore be allowed to float - but the success and duration of that maneuver is hard to predict.
The reason I think one cryptocurrency will dominate is due to the simple and numerous network effects. But which shall it be? I think the value will initially be derived from bitcoin, yet it will run on a different, yet-to-be-invented blockchain that does not use proof of work. Conceivably, the transition off the legacy blockchain onto another will be facilitated and encouraged by the core devs of the legacy technology, but possibly it will not be. (segwit2x was not it, however; nor would the new blockchain look anything like segwit2x, from a technical perspective.)
The transition from the legacy blockchain to the new one(s) will be initiated via some sort of sidechain / cryptographic two way peg strategy, but with the peg eventually and intentionally severed so as to retire the old blockchain. The retirement of the old chain will be necessary so as to separate the money supply from the proof of work.
It is possible that a gradual transition as described above, which starts out looking like a pegged sidechain and ends up looking like a hard fork after completion, is not what will occur. Possibly, the community will just jump straight to the in-your-face hard-fork strategy, as 2017 saw more than once. But I hope not, because contentious hard forks, sans smooth transition mechanisms, are irresponsible and unnecessary.
History makes fools of most forecasters, and history may treat me no more kindly than others. But perhaps the odd fellow exists out there somewhere who would enjoy reading this, as much as I did writing it. Whatever the future holds, we will want to be nimble.
Monday, November 13, 2017
Any discussion about "appropriate" price for a unit, e.g. a bitcoin, must first explore "appropriate" market capitalization. The division of net money supply into units (e.g. bitcoins or satoshis) is as arbitrary as the division of a company's market capitalization into shares. The only economically meaningful unit is the company as a whole. In exactly the same sense, the only economically meaningful unit for a cryptocurrency is that cryptocurrency's market capitalization as a whole - and so this is where any analysis must start. At the end of the analysis, there is a mundane point of dividing capitalization by money supply so as to translate into a unit price, but that is of no economic significance whatsoever. Any analysis which starts out by examining the value of a "bitcoin" or a "satoshi" is therefore absurd. I wonder if this strangely common oversight is because we are accustomed to fractional reserve banking systems, where (unlike cryptocurrencies) the money supply is difficult to pin down.
But even that is not the start. No analysis should discuss "appropriate" market capitalization of a specific cryptocurrency until first considering the appropriate capitalization of cryptocurrencies taken together. After doing that, they should then formulate some argument as to what share of that net capitalization should constitute a specific cryptocurrency of interest (e.g. bitcoin). One might keep that pithy for example by saying that network effects should allocate the vast majority of net cryptocurrency market capitalization to just one single cryptocurrency, and then make the case that this cryptocurrency is the specific one under discussion (e.g. bitcoin perhaps due to first mover advantages). But some such claim should be made, as the validity of the analysis which follows must rely on it.
And how can one consider appropriate market capitalization of cryptocurrencies as a whole, without asking about viability? There are numerous existential risks which cryptocurrencies must survive, e.g. specter of cryptopocalypse (especially as quantum computation comes online), instability of code and protocol governance (especially as cryptographically naive money pours in), transition to palatable / scalable energy profile, continued acceptance by world governments, economic stability of competitive market among denationalized currencies (as first discussed by Hayek in 1976), etc. Some odds must be placed on long-term viability of the cryptocurrency space so that all subsequent analysis can be discounted accordingly.
I can't imagine that each article could reasonably address all of these points directly, but there should be articles addressing each in turn and they should reference each other. For example, any article saying bitcoin is presently under or overvalued should reference articles spelling out the underlying assumptions to which that particular author subscribes, so that there can emerge coherent schools of belief that can compete over relative merits.
If one did want to ignore the fundamentals entirely and only look at price movements, then one must first argue that the fundamentals are sufficiently stable and accounted for by market price to not dwarf other considerations. That seems like a difficult argument to make with a straight face. But if one were to take this approach, then all price movements should be considered on a logarithmic scale, as the calibration of units is itself one of the points being explicitly ignored.
Any price analysis which does not directly or indirectly (e.g. by reference to other articles) address these fundamentals would seem as credible as a roll of the dice.
Thursday, November 9, 2017
Feedback is welcome. If anyone knows of similar proposals, or even just anything close enough from a functional perspective to merit inclusion in an appendix, I'd love to hear about it.
Update: instead of linking to a specific version of the paper, here is a link to a page about DCKR which I'll update with new versions of the paper, diagram, or other material. Otherwise, all of my posts would soon just be about DCKR. Check this page periodically for the latest:
Wednesday, November 8, 2017
I should also suggest a few tweaks to the original version, either with or without use within a blockchain construct. First, it should not be necessary to include previously used keys with the message being signed. Second, it may be useful to have several parallel chains of keys, scheduled to be revealed at different intervals, so as to simultaneously accommodate a range of timeliness versus latency requirements. Also, more sophisticated implementations might find it useful to include a mechanism for adjusting the release schedule of one or more chains of keys, or for scheduling the transition to or addition of new chains of keys, possibly using different hash functions - simply by precommitting to these changes within the signed structures.
To temporarily delegate signing authority, you simply share a future key with the individual, organization, or device (etc) which you wish to sign on your behalf. The date of the future key which you share is automatically the date on which that delegation expires, because the entity to which you have delegated signing authorities can derive earlier signing keys as required - but not later signing keys.
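A sketch of that delegation mechanic, assuming a simple SHA-256 hash chain with one key per day (the indexing convention and function names are mine):

```python
import hashlib

def H(data):
    return hashlib.sha256(data).digest()

def make_chain(seed, length):
    """Precompute a hash chain. chain[0] is the secret tip; chain[-1]
    is the public commitment. The key for day d is chain[length-1-d],
    so keys for LATER days sit closer to the secret tip."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(H(chain[-1]))
    return chain

def key_for_day(chain, day):
    return chain[len(chain) - 1 - day]

def derive_earlier_key(shared_key, shared_day, wanted_day):
    """Anyone holding the key for shared_day can hash forward to get
    keys for earlier days - but can never un-hash to obtain later
    ones. Sharing a future key thus delegates authority that expires
    automatically on that key's date."""
    assert wanted_day <= shared_day
    k = shared_key
    for _ in range(shared_day - wanted_day):
        k = H(k)
    return k
```

The delegate receives one future key and re-derives whatever earlier keys they need; the one-wayness of the hash enforces the expiry.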
What assumption then does DCKR make for derivation of asymmetry? It assumes time travel is not possible. If anyone invents a time machine, we might have to reassess utility of DCKR as well.
(DCKR further relies on a cryptographic hash function, so irreversibility of that hash function is also necessary. But we generally rely on existence of such a function anyhow, so the goal is to not introduce additional attack points into such systems. Also, we have a larger number of hash functions to choose from - so we can more readily swap out one function for another should a vulnerability be found.)
Perhaps I could sign the text of each post with GPG. Most visitors won't care, and those who do could authenticate posts without having to trust such a giant list of CAs that they have probably never vetted. But what happens when the asymmetry assumed by GPG is lost, perhaps due to quantum computers?
Perhaps I'll experiment with my own DCKR proposal. I could publish a script for hash-based signing of blog contents on this same blog, and then I could publish a DCKR signature stream at a regular interval. Perhaps I should practice what I preach.
Tuesday, November 7, 2017
Sorry Devops199, we know you didn't mean to destroy $300 million in ether.
I've posted before about how Ethereum takes risks in its design, although I'm usually thinking more about its use of cryptography. Well, this particular $300 million issue was NOT a bug in Ethereum; it was a bug in a smart contract. One might say we should not blame Ethereum any more than we should blame Bitcoin for Mt. Gox. But on the other hand, Mt. Gox was possible because it wasn't using the blockchain to secure its funds, whereas those $300 million disappeared through the use of Ethereum-supplied smart contracting facilities, and I gather it affects users who hold their own private keys. This one was not a custodial risk issue. (Unless you count the coding itself.)
In the end I do not blame this on Ethereum, but I do think Ethereum should develop safeties and provability in its contracting language.
I recently posted a proposal for delayed chained key revelation (DCKR). I should point out a couple details which might or might not be obvious.
First: it is necessary for the public to have some rough idea of the maximum age of a revealed key. If an asserted chain has a last key of maximal age A seconds at the time a user first becomes aware of it, then that user must wait until they see a key publicly revealed more than A seconds later. Otherwise, the chain they are confirming could have been fraudulently created using knowledge perhaps not yet known to the user, but knowledge that was already public nevertheless.
There are many ways to do this. One of the simpler ways is for the signer to simply precommit to a schedule of earliest release by key index.
Another question is what to do when the signer runs out of pregenerated keys. This is also easily addressed: as the signer nears the end of a chain, they can include the final hash of a new chain of hashes in the previously described signed structure.
Although I have only described chains so far, trees could also be used so as to conceptually mimic BIP 32 run backwards (but without asymmetric algorithms in the traditional sense).
Asymmetric cryptography is a slightly dodgy proposition today, since all known asymmetric cryptosystems require introduction of at least one additional unproven mathematical assumption. Unproven assumptions are targets, and we like to minimize target area - especially as we head into the age of quantum computation. So as to minimize this target area, can we somehow introduce useful asymmetries into more conventional symmetric cryptosystems? Why not use the dimension of time to do so? Perhaps the current knowledge of not-yet-revealed information can be leveraged into useful asymmetries.
Consider, for example, a Proof of Stake algorithm with functional goals similar to Ethereum's Casper. Suppose we wish to use a symmetric signature algorithm, instead of an asymmetric cryptosystem. Before registering as a verifier, one could privately generate some securely random data, and then precompute a very long chain of n hashes using a cryptographic one-way hash function. At time of registration, only the very last of these hashes would be publicly revealed. Call that last hash H(n).
Now when signing the first block, the verifier would use H(n-2) as the signing key, and would sign the combination of the block plus H(n-1), and would additionally reveal H(n-1). When the next block is ready, the verifier would sign the combination of it with H(n-2) using H(n-3), and would publicly reveal H(n-2). Etc.
In this way, sufficiently old signatures could be verified by anyone using publicly available data. While it is true that anyone could produce a fraudulent chain of old data that would be plausible up until publication of next block, it is also true that the fraudulent chain's credibility would have to expire - since the true verifier had not revealed enough information publicly for anyone to commit future fraudulent signatures in this chain. A new participant in the network thus simply collects candidate chains, and then waits to see which (if any) maintain their plausibility past the time of collection.
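Here is a minimal sketch of the signing and verification steps just described, using HMAC-SHA256 as the symmetric signature (an assumption on my part; the text only requires some symmetric algorithm). Chain element i is H applied i times to the seed, matching the H(n) notation above:

```python
import hashlib
import hmac

def H(data):
    return hashlib.sha256(data).digest()

def make_chain(seed, n):
    """chain[i] = H applied i times to the seed; only chain[n], i.e.
    H(n), is publicly revealed at registration."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    return chain

def sign_block(chain, i, block):
    """Sign block number i (1-based): MAC the block together with the
    about-to-be-revealed key H(n-i), using the still-secret H(n-i-1),
    then reveal H(n-i). For i = 1 this is exactly the H(n-2)/H(n-1)
    pattern described above."""
    n = len(chain) - 1
    key, revealed = chain[n - i - 1], chain[n - i]
    tag = hmac.new(key, block + revealed, hashlib.sha256).digest()
    return tag, revealed

def verify_block(pub_commitment, i, block, tag, revealed, later_key):
    """later_key is the signing key H(n-i-1), which itself becomes
    public when block i+1 is signed. Check the MAC, check that the key
    hashes to the revealed value, and check that the revealed value
    chains up to the registered commitment H(n)."""
    ok = hmac.compare_digest(
        hmac.new(later_key, block + revealed, hashlib.sha256).digest(), tag)
    k = revealed
    for _ in range(i):
        k = H(k)
    return ok and H(later_key) == revealed and k == pub_commitment
```

Each block's signature thus only becomes verifiable (and only stays credible) once the next key in the chain has been publicly revealed.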
I recognize this explanation might be difficult to understand without some diagrams. I apologize for that. I'll write up a more reader-friendly whitepaper in the near future. There are many parameters one could tweak, and there may be other applications. The most general concept here is simply the use of not-yet-revealed knowledge to generate other cryptographically useful asymmetries.
This is the first proposal I have seen for a smart contracting language carefully designed for certain useful provability characteristics. When designing a smart contract, it does seem important to know in advance how much computational load it will require!
There seems to be some excellent research coming out of Blockstream.
A good friend recently sent me a link to an article estimating the (very large) amount of electricity consumed by Bitcoin's Proof of Work algorithm. In the message, he said "fix this."
My primary concern to date has been the economic risk of transitioning a large amount of economic activity onto cryptocurrencies, only to have the cryptographic foundation collapse very suddenly and spectacularly. This is why I have been probing the potential vulnerabilities in that foundation, and looking at ways to mitigate. If cryptocurrencies are to fractional reserve fiat banking as airplanes are to cars, then I've been looking at the risk of crashing. But what about fuel economy? It is difficult to optimize a thing before you have finished inventing it. On the other hand, network effects can lock in unfortunate design decisions. Perhaps it is not too soon. Perhaps these questions can't be asked soon enough.
The lightning network and other off-blockchain scale mechanisms will inevitably improve efficiency. Those are going to happen. But can we do better than pure proof of work for the underlying blockchains? What about Proof of Stake? I don't like Casper's exposure of public keys and the implied failure modes should the asymmetry one day be lost - which brings us back to the economic risk of crashing our plane. One might think of running BIP 32 backwards to protect public keys, but that would contain a subtle but serious vulnerability. But then, BIP 32 was not designed for this application and was not intended to be run backwards. Perhaps we can design a better solution. Hmm...
I'll tag posts related to economic and environmental responsibility as "responsible cryptocurrency", since the two considerations cannot be addressed separately.
Monday, November 6, 2017
This is just one example of why I like the cryptographic conservatism that I tend to find in Bitcoin, such as the habit of not disclosing public keys until time of use, and the habit of only using them once. I would not be surprised if quantum computers throw similar surprises our way.
(I wish Ethereum's new Casper PoS algorithm was similarly conservative.)
Apparently, there are very wealthy individuals trying to secure their Bitcoin via underground, physically secured vaults.
I understand the urge to hide valuables in the physical world. Virtual material, however, can also be hidden in the virtual world. When hidden in (or derived from) public material whose availability and integrity is already being carefully assured for other reasons, you get to leverage all of that integrity and availability, for free.
Of course, you should always assume your data communications are being monitored when you are accessing relevant online material. The techniques you employ must account for that possibility.
As the Bitcoin community braces for a needlessly hostile fork, I'd like to remind future would-be forkers that there is a cool concept known as a two-way peg. Unless your gripe involves security of the money supply itself, then chances are pretty solid that a two-way pegged sidechain would allow transfer of genuine Bitcoin to and from your possibly superior blockchain tech, such that the market (i.e. users) can fluidly transfer to whatever mechanism best fits their use cases.
Conversely, I'd encourage Bitcoin core to do whatever they safely can to support and encourage sidechains. This concept holds potential as a safer, lower friction and more pro-social mechanism for blockchain evolution, and could moreover serve as an important safety valve.
In 1934, Karl Menger made a simple mathematical error that no one caught. His (incorrect) 1934 paper has been celebrated by Nobel laureates ever since. The result of this error has been to obscure from economists certain objective facts of risk management and wealth distributions, forcing them to rely on subjective mechanisms like utility functions and risk aversion instead.
The physicist Ole Peters has found the mistake. Now all we need to do is to, somehow, get economists to listen. Ole Peters apparently got Ken Arrow to change his tune, but unfortunately Ken Arrow is no longer with us to help champion the message.
In preparation, it seems prudent to develop open taxonomies which classify cryptocurrencies (and other applications) in terms of the cryptographic primitives on which their mandatory and optional features are based - and that also in turn classify those primitives by the mathematical assumptions on which they are based. Such a resource might help the community to more quickly, intelligently, and transparently respond to cryptanalytic surprises as they present themselves.
I do not see as much preparatory homework being done as I'd expect, and that makes me nervous.
Since we do not have the letter "å" in English, I suppose it was transliterated to the English letter with the closest sound.
If your native tongue leaves you wondering why I would bother explain the origins of "Mord", then... never mind.