I want to share thoughts on how to measure the amount of information that is artificially added by the transposition scheme applied in the solution of the 340. That amount can then be compared to the budget imposed by the additional length of the 340 beyond the unicity distance of a 63-symbol homophonic cipher. If the injected information is less than what the 340's length minus the unicity distance is worth, then the solution must be the one and only one. The question to answer is whether the transposition is an artistic anagram that delivers a crafted solution, driven by the brute force of modern computer power into a riddle of different original intent, or whether the solution is the inevitable conclusion of the given constraints.
Shannon measured the amount of information ("entropy") in a text as H = -sum(p*log2(p)), with p the likelihood of each letter in natural text. That measure is no help in quantifying the information of a transposition, because the frequency of each letter is conserved. Instead, it is the nature of a transposition that it alters the sequence of the letters rather than their frequencies. The transposition acts on the conditional likelihood of a letter following a given previous one, which is just the bigram frequency.
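A minimal sketch in Python of that point, using a placeholder text and a toy transposition (not the 340 scheme): the unigram entropy is unchanged by reordering, while the bigram statistics are not.

```python
import math
from collections import Counter

def unigram_entropy(text):
    """Shannon entropy H = -sum(p * log2(p)) over single-letter frequencies."""
    counts = Counter(text)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def bigram_counts(text):
    """Counts of adjacent letter pairs."""
    return Counter(text[i:i + 2] for i in range(len(text) - 1))

original = "THEQUICKBROWNFOXJUMPSOVERTHELAZYDOG"
transposed = original[::2] + original[1::2]   # a toy transposition, not the 340 scheme

# Letter frequencies are conserved, so the unigram entropy is identical ...
print(math.isclose(unigram_entropy(original), unigram_entropy(transposed)))  # True
# ... but the bigram statistics change, which is where the injected information lives.
print(bigram_counts(original) == bigram_counts(transposed))                  # False
```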
The question is how much information is introduced by the transposition. As a very simple example, FI is a transposition of IF, yet the two have different bigram likelihoods and therefore different information content: FI hardly occurs as a pair, while IF is frequent in English. Transposing FI into IF therefore artificially injects, or destroys depending on the point of view, information. In the most generic case, it turns something meaningless into something meaningful. Each decision that swaps the sequence of letters I hence count as introducing the language-average bigram entropy.
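As a sketch of the FI/IF asymmetry, the information content of a bigram can be taken as -log2(p), with p estimated from whatever English corpus is at hand; the corpus string below is only a placeholder, and the floor value for unseen pairs is an arbitrary choice.

```python
import math
from collections import Counter

def bigram_information(corpus, bigram):
    """Information content -log2(p) of a bigram, with p estimated from a corpus."""
    counts = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
    total = sum(counts.values())
    count = counts[bigram]
    p = count / total if count else 1 / (2 * total)   # crude floor for unseen pairs
    return -math.log2(p)

# Placeholder corpus; a real estimate would use a large English text sample.
corpus = "IFYOUCANKEEPYOURHEADWHENALLABOUTYOUARELOSINGTHEIRS"
print(bigram_information(corpus, "IF"))   # frequent pair -> fewer bits
print(bigram_information(corpus, "FI"))   # unseen pair   -> more bits
```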
The unicity distance for a homophonic cipher with N = 63 symbols is 1.47*N - 49.91 + 0.45*(0.5 + N)*ln(N) - 0.45*(N - 25.5)*ln(N - 26) = 100 characters. The 340 is 240 letters longer than that, and at 4.14 bits per letter of English (according to https://www.cse.iitb.ac.in/~cs626-460-2012/seminar_ppts/NLP_and_Entropy.pdf) this excess is worth 240 * 4.14 = 993.6 bits.
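A quick check of the arithmetic, with the formula taken verbatim from above and N = 63:

```python
import math

def unicity_distance(n):
    """Unicity distance estimate for a homophonic cipher with n symbols (formula above)."""
    return (1.47 * n - 49.91
            + 0.45 * (0.5 + n) * math.log(n)
            - 0.45 * (n - 25.5) * math.log(n - 26))

ud = round(unicity_distance(63))      # ~100 characters
excess_letters = 340 - ud             # 240 letters beyond the unicity distance
excess_bits = excess_letters * 4.14   # 993.6 bits at 4.14 bits per letter
print(ud, excess_letters, round(excess_bits, 1))
```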
A bigram carries 7.7 bits of information, 3.56 bits more than a single letter. "Optimizing" the sequence of letters in a transposition therefore artificially injects 3.56 bits each time. That budget would allow an impressive 993.6 / 3.56 = 279 swaps before unicity is lost, which explains why Robert Graysmith still had difficulty producing a meaningful match with his fairly free anagramming.
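The swap budget, worked out from the numbers above:

```python
# Numbers as established above.
excess_bits = 993.6          # constraint budget beyond the unicity distance
bits_per_letter = 4.14       # single-letter entropy of English
bits_per_bigram = 7.7        # bigram entropy
bits_per_swap = bits_per_bigram - bits_per_letter    # 3.56 bits injected per arbitrary swap
print(round(bits_per_swap, 2), int(excess_bits / bits_per_swap))   # 3.56 279
```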
At first glance, the 340 transposition moves almost every one of the 340 letters. The key difference is that the applied transposition is highly redundant and amounts to far fewer than 340 independent decisions. It can be described by a simple pattern: a sequence 19 letters long, each position advancing by a count of 18, repeated 18 times. That is only 3 pieces of arbitrary information, worth 3 * 3.56 = 10.68 bits.
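To illustrate why this counts as only three pieces of information, here is a minimal sketch that expands a pattern length, a step and a repeat count into a complete reading order. The parameters follow the description above; it is only an illustration of the principle, not a reconstruction of the exact 340 scheme (note that 19 * 18 = 342 positions, two more than 340).

```python
def pattern_order(length=342, pattern_len=19, step=18, repeats=18):
    """Expand three small parameters into a full reading order: `repeats` passes,
    each taking `pattern_len` positions that advance by `step`."""
    order = []
    for start in range(repeats):
        for i in range(pattern_len):
            order.append((start + i * step) % length)
    return order

order = pattern_order()
print(len(order), len(set(order)))   # 342 342 -- every position hit exactly once
```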
That leaves no room for artistic freedom, and no doubt: the solution must be unique by formal statistical arguments.