Hello,
I have renovated my solver, and made it accessible on:
https://github.com/freichmann/cDecryptor (homophonic solver in C++)
https://github.com/freichmann/jDecryptor (homophonic solver in Java)
https://github.com/freichmann/Languages (self-compiled English n-grams up to length 5)
https://github.com/freichmann/Ciphers (a collection of ciphers, mostly but not only from zodiackillersite.com)
Both the C++ and the Java solver rely on the same scoring algorithm: the n-grams are counted and, assuming a Poisson distribution, the likelihood of each n-gram occurring this many times is calculated. This means that if an n-gram occurs more often than expected, its likelihood drops again. Each n-gram's Poisson likelihood is then compared, assuming a Gaussian normal distribution, to what is expected from a previously analyzed sample text. For better numerical stability, the natural logarithms are added instead of the likelihoods being multiplied.
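To make this concrete, here is a minimal sketch of the Poisson part of that idea in Python. This is not the actual cDecryptor code; the corpus counts, the 0.5 floor for unseen n-grams, and the toy numbers are all made-up placeholders.

import math
from collections import Counter

def poisson_log_score(text, corpus_counts, corpus_size, n):
    """Sum of log Poisson likelihoods for the n-gram counts observed in text."""
    positions = len(text) - n + 1
    observed = Counter(text[i:i + n] for i in range(positions))
    score = 0.0
    for ngram, k in observed.items():
        # Expected count of this n-gram in a text of this length;
        # unseen n-grams get a small floor instead of zero.
        lam = positions * max(corpus_counts.get(ngram, 0), 0.5) / corpus_size
        # Log of the Poisson pmf P(k; lam) = lam^k * exp(-lam) / k!
        score += k * math.log(lam) - lam - math.lgamma(k + 1)
    return score

# Toy usage with made-up corpus counts:
corpus = Counter({"th": 300, "he": 250, "es": 120, "st": 110})
print(poisson_log_score("thestreet", corpus, corpus_size=10_000, n=2))

An n-gram that occurs more often than its expected count lowers the sum again, which is the property described above.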
The hill-climbing implementation is very simple: the best single change is searched for and applied, and if there is none, the best known solution so far is partially randomized and the search continues from there.
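In rough Python terms, the loop looks something like this. It is only a sketch of the described strategy, not the real implementation; score, best_single_change and partially_randomize are hypothetical helpers, and the 0.25 fraction is an arbitrary example value.

def hill_climb(start_key, score, best_single_change, partially_randomize, iterations=10000):
    current = start_key
    best_key, best_score = current, score(current)
    for _ in range(iterations):
        candidate = best_single_change(current)  # best of all one-symbol changes, or None
        if candidate is not None and score(candidate) > score(current):
            current = candidate
        else:
            # No improving change: partially randomize the best known key and retry.
            current = partially_randomize(best_key, fraction=0.25)
        if score(current) > best_score:
            best_key, best_score = current, score(current)
    return best_key, best_score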
Both the C++ and the Java version do – sometimes – solve the 408. In general, convergence is very slow, in the range of hours, maybe days on 1 GHz quad-core low-end SoCs (better than a Raspberry Pi, but worse than a standard PC), maybe never; it depends on the luck of the random re-initializations. The C++ solver with the supplied n-grams runs on a Raspberry Pi Zero, consuming one watt of power. The solver works best (or: works only) when loading the 1-grams (that is a must) and the 5-grams only. The problem with the hill-climber is that it is permanently stuck in local optima, until it finally has a lucky start and makes some progress again. I have not got this well under control yet.
I still plan to improve it, but I want to keep the code to a minimum, limited to only the functional part; code readability is more important to me than speed or comfort of usage. And I want to keep it free from "magic numbers", e.g. for building a weighted sum or product of different scores (like entropy, IoC, n-gram count), unless they are either obvious or have a statistical or other theoretical justification, even if they work better. My goal is to remain free of unintentional bias.
The original idea of my code was my speculation that Zodiac might have used a cipher disk in order to build a complex scheme in a pre-computer era. The code no longer focuses on that idea: I neither got good results, nor did it match the expectation in this forum that there would be a signal in the self-correlation at cipher-disk size.
The code is intended for people with a background in software development. Enjoy! If you have feedback, ideas or suggestions, please let me know!
Thanks for sharing, f.reichmann.
The problem with the hill-climber is that it is permanently stuck in local optima, until it finally has a lucky start and makes some progress again. I have not got this well under control yet.
How do you accept solutions? Something like: if new_score > old_score then accept? If so, try to subtract a small value ("magic number") from old_score every time a solution is not accepted. That should help a lot with it being stuck in local optima.
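In code, that suggestion would look roughly like the sketch below; the handicap bookkeeping and the 0.01 decrement are just illustrative choices, not anything from an existing solver.

def accept(new_score, old_score, state, decay=0.01):
    # Accept if better; otherwise lower the bar a little with each rejection.
    if new_score > old_score - state["handicap"]:
        state["handicap"] = 0.0   # reset once a move is accepted
        return True
    state["handicap"] += decay    # every rejection makes acceptance a bit easier
    return False

# usage: state = {"handicap": 0.0}, then call accept(...) for every candidate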
Thanks for updating your solvers and informing us, f.reichmann.
I refreshed my checkout of the cDecryptor source and recompiled it. I’m trying to run it on the 408 but it’s not finding the solution yet. Here is the invocation command I’m using:
./cDecryptor -t 4 -r 0.25 -g 0.001 -w ./Languages/English/sample.txt -l ./Languages/English/1-grams.txt -l ./Languages/English/5-grams.txt -c ./Ciphers/408.zodiac.txt
Is that correct? Or do you recommend different parameters?
The best scoring solution so far is:
Tue Oct 6 21:37:47 2020 Thread 1 -22.9617 5:-37.1569 1:-11.8459 ssnotocosmonirriotalethehsenuerehenshecampetidehoindiostononicsandermeantsteareeatethuletoseanhertiedindiahrdotetotedtoatontourerencennmateerantertianestononlyihadiercemetaroatailiandenheasonstheateourseinintncisandareeitearsmistanddintinecrmistoatalationeidedacehpoetssandmndarossstaintoareretesettalusisosotincalsthesedetatethersthisstaisereorseriotapariesetttheinonreestatustiestsaleonetharrestaherennicam
Hi Jarlve & Doranchak
thank you both for your replies.
For the acceptance: correct, Jarlve, I was using a simple score comparison, choosing the best, and when getting stuck, resetting part of the solution. Unfortunately, this got stuck regularly. I have now implemented a tolerance scheme inspired by the zkdecrypto implementation, which accepts going downhill if the score is not worse than a configurable percentage of the best known score in this iteration. Unlike zkdecrypto, I implemented something like an auto-pilot for the tolerance value: each iteration counts how many downhill acceptances were done, and the implementation permanently adjusts the tolerance factor to stay close to a configurable value. I noticed that there is a sweet spot around 1.5%, which I find acceptable as a "magic number" in the hill-climber: I only distrust such numbers in the scoring, not in the hill-climber.
Still, the algorithm may get stuck, so I implemented an option to reset fully after a maximum number of iterations, making it possible to run the program in the background without looking after it.
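A sketch of that auto-pilot idea in Python (hypothetical code, not the actual cDecryptor implementation; the 5% adjustment step is made up, and the comparison assumes the scores are negative log-likelihoods as elsewhere in this thread):

def tolerant_accept(new_score, best_score, state, target_rate=0.015):
    # Accept downhill moves within a tolerance band around the best known score,
    # and steer the tolerance so that roughly target_rate of all tests go downhill.
    state["tests"] += 1
    accepted = new_score >= best_score * (1.0 + state["tolerance"])
    if accepted and new_score < best_score:
        state["downhill"] += 1
    rate = state["downhill"] / state["tests"]
    state["tolerance"] *= 1.05 if rate < target_rate else 0.95
    return accepted

# usage: state = {"tolerance": 0.02, "tests": 0, "downhill": 0}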
Dave, the command line you use is correct. With the tolerance implementation, the C++ command is:
~/git/cDecryptor/cDecryptor -t 4 -r 0.1 -f 0.015 -x 250 -w ~/git/Languages/English/sample.txt -l ~/git/Languages/English/1-grams.txt -l ~/git/Languages/English/5-grams.txt -c ~/git/Ciphers/340.zodiac.txt
and in Java:
cd src; /opt/java/bin/javac -d ../bin -cp '../lib/*' ./jDecryptor/*.java ./jDecryptor/Move/*.java; cd ..
/opt/java/bin/java -cp /home/fritz/git/jDecryptor/bin/:/home/fritz/git/jDecryptor/lib/commons-cli-1.4.jar:/home/fritz/git/jDecryptor/lib/commons-math3-3.6.1.jar -Xmx3096M -Xms3096M jDecryptor/Decryptor -r 0.1 -f 0.015 -x 250 -c /home/fritz/git/Ciphers/408.zodiac.txt -t 4 -m TXT -w /home/fritz/git/Languages/English/sample.txt -l /home/fritz/git/Languages/English/1-grams.txt -l /home/fritz/git/Languages/English/5-grams.txt
You may need to adjust the directory paths, obviously.
A brief explanation of the options:
-t: Number of threads
-r: For the soft reset, the fraction of the solution to reset when reaching a local optimum
-x: Number of iterations after which to hard-reset the solution candidate, starting fully from random
-f: Target for the tolerance auto-pilot; 0.015 means that roughly 1.5% of the tests should accept a downhill decision. This permanently adds some jitter, which seems to help avoid getting stuck for long
-d (C++) or -a (Java): Verbose logging, including the soft-reset start and end states
-s (C++): Seed string, representing the solution as a string
-s (Java): Filename to load the seed from, as output by the Java program. The last line is taken.
Both the C++ and the Java version should solve the 408 within hours on household-type computers. This is an improvement over the code of two weeks ago, which took days. Both will produce exactly the same scores for the same solutions if loaded with the same n-gram and sample files. I will one day align the options between the C++ and Java implementations, to make them really fully the same.
If we look at the AZdecrypt scoring formula then I would say that there is no magic number involved: ngram_score_sum/ngrams*entropy.
There is an n-gram factor setting which is just there to get the scores into the 20k range, which was a design choice, and then there is an entropy factor setting which usually remains at 1 so it does nothing: entropy^1.
Here the entropy part works similarly to your inclusion of 1-grams, that is, it counterweights the higher n-grams' tendency to only use/chain ETAOIN-heavy stuff like "ESORERETSATHREERT". I would say that by using entropy instead of 1-grams the solver demands less of how the plain text should look.
And congratulations on your improvements!
Just cheerleading a bit…KEEP DIGGING GANG! You guys that work on these ciphers might one day find something big!
There is more than one way to lose your life to a killer
http://www.zodiackillersite.com/
http://zodiackillersite.blogspot.com/
https://twitter.com/Morf13ZKS
If we look at the AZdecrypt scoring formula then I would say that there is no magic number involved: ngram_score_sum/ngrams*entropy.
There is an n-gram factor setting which is just there to get the scores into the 20k range, which was a design choice, and then there is an entropy factor setting which usually remains at 1 so it does nothing: entropy^1.
Here the entropy part works similarly to your inclusion of 1-grams, that is, it counterweights the higher n-grams' tendency to only use/chain ETAOIN-heavy stuff like "ESORERETSATHREERT". I would say that by using entropy instead of 1-grams the solver demands less of how the plain text should look.
…
AZdecrypt works so well that I have no doubt its algorithms are effective. The comments below are more of an attempt to better understand why and how they do.
As far as I understand so far: AZdecrypt loads a binary version of the n-gram counts from http://practicalcryptography.com/crypta … languages/, which is the absolute count of each n-gram as an integer. The score is calculated in subroutine azdecrypt_234567810g in file AZdecrypto.bi. In the subroutine, the score is first calculated in solver_ngram_main.bi as a sum of the counts (omitting here the subtraction of the previous value, with its initialization in solver_ngram_init.bi), effectively:
new_ngram_score+=g2(sol(j),sol(j+1))
Entropy is calculated from a pre-computed table
for i=1 to l*ngram_size
    enttable(i)=abs(logbx(i/(l*ngram_size),2)*(i/(l*ngram_size)))
next i
which is then summed up:
for i=0 to abc_sizem1
    entropy+=enttable(frq(i))
next i
so that it matches the standard -p*log(p) definition (with log base 2 here).
In the next step, in solver_fastent.bi, the score is multiplied by the entropy:
case 4:new_score=new_ngram_score*ngfal*entropy
where ngfal is a pre-computed but constant value.
The score is then effectively the sum of the corpus counts of all n-grams in the candidate, i.e. the n-grams weighted by their natural frequency, multiplied by the entropy value (without the comparison to what is expected from the language, as in zkdecrypto). I have not yet found where the division by the number of n-grams is done.
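Putting the pieces together, my reading of the scoring in Python terms is roughly the following. This is only a paraphrase of the fragments above, not the original FreeBASIC code; corpus_counts stands for the loaded n-gram counts, ngfal for the pre-computed constant, and the normalization of the entropy table is simplified.

import math
from collections import Counter

def az_style_score(plain, corpus_counts, n, ngfal):
    # Sum of the corpus counts of the n-grams appearing in the candidate.
    positions = len(plain) - n + 1
    ngram_score = sum(corpus_counts.get(plain[i:i + n], 0) for i in range(positions))
    # Entropy over the candidate's letter counts, as in the enttable: abs(p*log2(p)).
    denom = len(plain) * n  # plays the role of l*ngram_size in the fragments above
    entropy = sum(abs((c / denom) * math.log2(c / denom)) for c in Counter(plain).values())
    return ngram_score * ngfal * entropy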
Thinking about it more mathematically: let me use p for the probabilities of the language n-grams, like 8% for an E, and b for the probabilities in the solution candidate. Then the scoring function, as I understand it, is something of the form:
[Sum of (b*p)]*[Sum of b*ln(b)]
The first square bracket is the n-gram sum, the second is the entropy. The bias I can imagine comes from two thoughts:
1. The n-gram count will favour a repetition of the very most popular n-grams. Using only one letter, it would want to set everything to E.
2. The entropy is a measure of how even the distribution is. In particular, it will not change its value if all E and X are exchanged in a text, because it only depends on the frequencies as such.
Now I construct a very boring language with only the two letters A and B. Then the score of a solution candidate with a relative share b1 of A and b2 of B becomes (b1*p1+b2*p2)*(b1*ln(b1)+b2*ln(b2)), when the language corpus distribution of A and B is p1 and p2. In particular, (p1+p2)=1 and (b1+b2)=1, so the score simplifies to (b1*p1+(1-b1)*(1-p1))*(b1*ln(b1)+(1-b1)*ln(1-b1)). The hill-climber will optimize this function. Now I make the language even more boring: A has a likelihood of 100% and B of 0%; the language is only one character, always. Then the score of a solution candidate with b1 many A and b2 many B is b1*(b1*ln(b1)+(1-b1)*ln(1-b1)). The interesting part is that this function does not have its optimum at b1=1, but at b1≈0.7.
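A quick numeric check of that claim (a standalone Python sketch, not code from any of the solvers; the entropy bracket is taken as a positive value, matching the abs() in the enttable code quoted above):

import numpy as np

# Score of a candidate in the one-letter language (p1=1, p2=0):
# n-gram term = b1, entropy term = -(b1*ln(b1) + (1-b1)*ln(1-b1)).
b1 = np.linspace(1e-6, 1 - 1e-6, 100001)
entropy = -(b1 * np.log(b1) + (1 - b1) * np.log(1 - b1))
score = b1 * entropy

print(b1[np.argmax(score)])  # prints roughly 0.70, not 1.0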
I am sure that I either err or the effect is very weak. It might become visible when solving a cipher with multiplicity 1 (no symbol ever repeats), where the scoring function is left with no information at all and seeks its own built-in sweet spot. When I tried with zkdecrypto (I have no Windows), it solved to only A, but it also compares against the language entropy, and calculates a little differently than AZdecrypt.
For these theoretical thoughts, and for the reason that even a 1 is a chosen weight, I try to use scoring based on Poisson and normal distributions. These would punish an over-count of E if it rises above the expected value. But despite all these thoughts, I must add that my results are weaker than those of AZdecrypt, so I would rather evolve my theory than doubt AZdecrypt's well-proven results.
Just cheerleading a bit…KEEP DIGGING GANG! You guys that work on these ciphers might one day find something big!
Thanks morf13!
The comments below are more of an attempt to better understand why and how they do.
Well illustrated. You are spot on about everything. The division by the number of n-grams is pre-calculated in the ngfal variable:
al=l-(ngram_size-1) 'number of n-grams in cipher
ngf=ngramfactor*ent_score_norm 'basically read this as ngf=ngramfactor since ent_score_norm is just a small fix to the score
ngf/=1+((s/l)*multiplicityweight) 'factor in multiplicity weight if used
ngfal=ngf/al 'divide by number of n-grams
For these theoretical thoughts, and for the reason that even a 1 is a chosen weight, I try to use scoring based on Poisson and normal distributions. These would punish an over-count of E if it rises above the expected value. But despite all these thoughts, I must add that my results are weaker than those of AZdecrypt, so I would rather evolve my theory than doubt AZdecrypt's well-proven results.
You are on to something though. The idea of using n-gram sums + frequency control is that of glurk. It was (perhaps) never really questioned or thought about until you showed up. You have given me some ideas to work on.
For the record: applying the transposition to the original 340, and scoring n-grams with a Poisson distribution while comparing each with what is expected from a Gaussian distribution over a sample text (to apply a more standard statistical reasoning to the claim "it is the one and only solution beyond reasonable doubts"), cDecryptor converges to (almost) the same solution. Summed over all 1,2,3,4,5-grams, the result is only ~1.3 sigma away from the theoretical maximum possible score. None of the zkd/AZD code or language data is reused here, so it is a fully independent confirmation in terms of a homophonic computational solver.
cDecryptor Version 12.12.2020 23:37 Reading cipher file /home/fritz/git/Ciphers/340.zodiac.txt Cipher: HER>pl^VPk|1LTG2dNp+B(#O%DWY.<*Kf)By:cM+UZGW()L#zHJSpp7^l8*V3pO++RK2_9M+ztjd|5FP+&4k/p8R^FlO-*dCkF>2D(#5+Kq%;2UcXGV.zL|(G2Jfj#O+_NYz+@L9d<M+b+ZR2FBcyA64K-zlUV+^J+Op7<FBy-U+R/5tE|DYBpbTMKO2<clRJ|*5T4M.+&BFz69Sy#+N|5FBc(;8RlGFN^f524b.cV4t++yBX1*:49CE>VUZ5-+|c.3zBK(Op^.fMqG2RcT+L16C<+FlWB|)L++)WCzWcPOSHT/()p|FkdW<7tB_YOB*-Cc>MDHNpkSzZO8A|K;+ Cipher length: 340 Reading transposition file: /home/fritz/git/cDecryptor/Transposition/340.zodiac.transposition.txt Applying transposition 0,19,38,57,76,95,114,133,152,1,20,39,58,77,96,115,134,136,2,21,40,59,78,97,116,135,137,3,22,41,60,79,98,117,119,138,4,23,42,61,80,99,118,120,139,5,24,43,62,81,100,102,121,140,6,25,44,63,82,101,103,122,141,7,26,45,64,83,85,104,123,142,8,27,46,65,84,86,105,124,143,9,28,47,66,68,87,106,125,144,10,29,48,67,69,88,107,126,145,11,30,49,51,70,89,108,127,146,12,31,50,52,71,90,109,128,147,13,32,34,53,72,91,110,129,148,14,33,35,54,73,92,111,130,149,15,17,36,55,74,93,112,131,150,16,18,37,56,75,94,113,132,151,153,172,191,210,229,247,267,286,305,154,173,192,211,230,248,268,287,289,155,174,193,212,231,249,269,288,290,156,175,194,213,232,250,270,272,291,157,176,195,214,233,251,271,273,292,158,177,196,215,234,252,255,274,293,159,178,197,216,235,253,256,275,294,160,179,198,217,236,238,257,276,295,161,180,199,218,237,239,258,277,296,162,181,200,219,221,240,259,278,297,163,182,201,220,222,254,260,279,298,183,202,204,223,241,261,280,299,184,203,205,224,242,262,281,300,185,187,206,225,243,263,282,301,186,188,207,226,244,264,283,302,170,189,208,227,245,265,284,303,171,190,209,228,246,266,285,304,164,165,166,167,168,169,309,308,307,306,310,311,312,313,315,314,317,316,318,319,320,321,324,323,322,326,325,334,333,332,331,330,329,328,327,335,336,337,338,339 After transposition: H+M8|CV@KEB+*5k.LdR(UVFFz9<>#Z3P>L(MpOGp+2|G+l%WO&D#2b^D(+4(5J+VW)+kp+fZPYLR/8KjRk.#K_Rq#2|<z29^%OF1*HSMF;+BLKJp+l2_cTfBpzOUNyG)y7t-cYA2N:^j*Xz6dpclddG+4-RR+4Ef|pz/JNb>M)+l5||.VqL+Ut*5cUGR)VE5FVZ2cW+|TB45|TC^D4ct-c+zJYM(+y.LW+B.;+B31cOp+8lXz6Ppb&RG+BCOTBzF1K<SMF6N*(+HK29^:OFTO<Sf4pl/Ucy59^W(+l#2C.B)7<FBy-dkF|W<7t_BOYB*-CM>cHD8OZzSkpNA|K;+ Randomize fraction: 0.2 Random re-initialization after 250 iterations Tolerance factor: 0.02 Parallel threads: 4 Reading norm file /home/fritz/git/Languages/English/1-grams.txt Reading norm file /home/fritz/git/Languages/English/2-grams.txt Reading norm file /home/fritz/git/Languages/English/3-grams.txt Reading norm file /home/fritz/git/Languages/English/4-grams.txt Reading norm file /home/fritz/git/Languages/English/5-grams.txt Analyzing sample text file: /home/fritz/git/Languages/English/sample.txt NGram length:5 NGrams:2303851 Samples:354005585 Mean:-3883.565483 StdDev:101.639654 Perfect: -5.540372 NGram length:4 NGrams:300309 Samples:354005586 Mean:-3324.995527 StdDev:74.520793 Perfect: -5.230017 NGram length:3 NGrams:17275 Samples:354005587 Mean:-2706.482463 StdDev:48.144425 Perfect: -4.793144 NGram length:2 NGrams:676 Samples:354005588 Mean:-2081.477924 StdDev:26.582133 Perfect: -4.199178 NGram length:1 NGrams:26 Samples:354005589 Mean:-1716.802735 StdDev:8.132748 Perfect: -3.014837 Best possible score: -22.777548 Tue Dec 15 12:03:20 2020 4 threads started. 
Tue Dec 15 12:03:20 2020 Thread: 1 Score: -3742.737596 1:-2040.085246 2:-3149.272427 3:-4855.284632 4:-6216.780387 5:-6663.954153 Tolerance: 0.020000 vwdjhwkolflwtbivjbhlxkddwseoiatsojldjyvjwlhvwxzpygoilhkolwelbtwkpqwijweaszjhtjlihivilhhjilhewlskzydytvjddgwljltjwxlhqpeljwyxipvqpelmqztliwkitrwybjqxbbvwemhhwefehjwttihodqwxbhhvkjjwxltbqxvhqkfbdkalqpwhplebhpwkoeqlmqwwtzdlwpvjpwlvgwltyqyjwjxrwysjhghvwlwyplwdylejddyitlwvllskwydpyejeejxtxqpbskplwxilwvlqeedlpmbidhpeelhlyzltmwdoqvojyawjijithlgw Tue Dec 15 12:03:20 2020 Thread: 3 Score: -3023.321987 1:-1963.479770 2:-3051.883639 3:-4662.270130 4:-6065.022462 5:-6660.073908 Tolerance: 0.020000 lnyuwwospzanmfekqromhommsjutmqvatqmyczccnswcniadzwzmspdzmnxmfvnodcnecncqakqoiupjoekmpsojmswussjdazmsmlnymgnaqpvcnissvkcacszhbjccjznkvkhsbjdjmnsnrcvirrcnxkoonxzcwcsivbptycnifwwkojqnhnmfvhcocozfmoqsvdnwkaxfwkwdzxvnkvnsvkymnjkqdnakgnavsvzcnuinsnacpwocnawzkasmspunymnbmmnlpsjdjzmkzuncxciihvjfjddmnimswkaczumajkremwduznsazkamkwytvlzuzqsnecbhwpgn Tue Dec 15 12:03:22 2020 Thread: 0 Score: -73.424526 1:-1747.312912 2:-2066.289433 3:-2752.115868 4:-3680.672273 5:-4691.680906 Tolerance: 0.021000 Clear: aeiosoienesetiatinsetitteotsateasieituhtersheronumbaranbeeleigeinceateatanisaondsatanystarsteronoutatarithesingterryeaasteuteshcsetoentreandtheinternnhelosseleasteageasicerisstitietttiethscieititrenesaslisaonbletoeeegnieestinestheseaeuteorheiatamshesouasetantritieteeanronautautraltratesionneerarotscettssonatsntetysunstooiseabouteratetsnhe Tue Dec 15 12:03:23 2020 Thread: 0 Score: -52.157229 1:-1736.623649 2:-2038.458955 3:-2709.355530 4:-3560.524284 5:-4528.057937 Tolerance: 0.019950 Clear: aeiosoannesetiatitslsasseatmateamilituhtersheranumbaronblelligeanceateatanisiondsatanistarsteranausetarishesingterrietasteusedhcditoendreandtherttertthelosseleasteigeomicerisstatiesttieshscaeisatrenestslistonbletoeeegniledtinestheseeeuteorheratomshesoutsesentrisretleanranaustutraltrisediannlerarotscitssdotassntitisunstooimeabouteratedsnhe Tue Dec 15 12:03:23 2020 Thread: 3 Score: -51.005434 1:-1741.227122 2:-2054.922861 3:-2681.803205 4:-3529.721329 5:-4515.274255 Tolerance: 0.019950 Clear: eriestofahermiatesoesonneailenatleeitattrsstreedariesprierceisrodtratrentheoreaboateanonessiesareandmelingreeastresntteeteaseattahosthiseerbmwedsttesstrcsoorchesterseplitreisstonersomitstotohinonstdrstecisttrictostreshieratedretgreadtatreewedttprotretateendailindemereasareantailecterstaiardereesttethineassansdihoneahemstilteieanelateisagr Tue Dec 15 12:03:25 2020 Thread: 3 Score: -46.771183 1:-1738.738931 2:-2057.704928 3:-2687.356514 4:-3518.401224 5:-4467.606038 Tolerance: 0.018952 Clear: eriestofahermiatesoesonneailenatleeitattrsstreedariesprierceisroderatrentheoreaboateanonessiesareandmelingreeastresntteeteaseateasouthiseerbmversttesstrcuoorchestersepliereisstonersomitstoeohinonstdrstecisttrictoutreshieratedretgreadtatreeverttprotretateendailinremereasareantailecterstaiardereestteesineausansdisoneahemutilteieanelateisagr Tue Dec 15 12:03:29 2020 Thread: 3 Score: -46.355589 1:-1742.223510 2:-2039.381497 3:-2649.703792 4:-3471.181491 5:-4436.908758 Tolerance: 0.018905 Clear: whieseofahehatanelesionneairandtresitatthsstheytareasureshcstphotthathentreemeaneanaanemassiesaryandawlingheeapthesntseeteaitattasortresternarealttellthcreehchestempturithetssnomehioattitetohtnonstthssectsserectortheprishanethengheddtatheereatturetheeaseendailinatashwasareansailectemitatartsheasenetsinearlanstisoneareareirtweeanelattesagh Tue Dec 15 12:03:29 2020 Thread: 2 Score: -44.559026 1:-1739.642679 2:-2051.956186 
3:-2679.189725 4:-3498.419821 5:-4428.123520 Tolerance: 0.015476 Clear: ueeastinstneteritethrieeateresterthedlidessieratlchesuchhenhereitserdeaseittyasatriesitnesseastcalestureementsrdersioaandalronisnttooinsofcateasedoreeienottentasdayroureseressiintertteoritsiteeissotesannesatchnotooeariehenittenimentsoldeareaseductientlanaessereesotheusstcflealerandryronetctherestinsteennoerestettinlintoterouhalsarrdonssme Tue Dec 15 12:03:31 2020 Thread: 3 Score: -35.888636 1:-1733.883051 2:-2030.409311 3:-2632.563605 4:-3401.879508 5:-4273.241373 Tolerance: 0.017960 Clear: ihiseeoforehasanolesionneairandcrositatthsetheetareasureshissphotthathencroemsobeanaonewaseiesareansaidincheoopthesntreeteaitattafortresterbarealttellthireehireetempturitheseenowohioastitetorsnonstthereiserereitortheprishanothenchedstathsereacturetheeareensoidinatashiosareanraideitemitasartsheasenetfinearlanetifoneareareirtiesanedatteeoch Tue Dec 15 12:03:33 2020 Thread: 3 Score: -34.110940 1:-1734.863067 2:-2045.927537 3:-2642.724415 4:-3400.574381 5:-4252.341498 Tolerance: 0.017062 Clear: iheseyouonehasanoletsonneairandcrotetatthsetheetareasurethitsphotthathencroemsomeanaonewaseiesareansaidencheoopthesntreeteastattafortrestermarealttellthireehineetempturetheseenowohsoaststetonsnonstthereiseryreitortheprethanothenchedstathsereacturetheyareensoidenatathiosareanraideitemstasarttheasynetfinearlanetifonearearyertiesanedatteeoch Tue Dec 15 12:03:34 2020 Thread: 3 Score: -34.029437 1:-1732.974433 2:-2045.411713 3:-2646.651575 4:-3408.813919 5:-4262.497729 Tolerance: 0.016209 Clear: iheseyouonehasanolessonneairandcrosetatthsetheetareasureshissphotthathencroemsobeanaonewaseiesareansaidencheoopthesntreeteastattafortresterbarealttellthireehineetempturetheseenowohsoaststetonsnonstthereiseryreitorthepreshanothenchedstathsereacturetheyareensoidenatashiosareanraideitemstasartsheasynetfinearlanetifonearearyertiesanedatteeoch Tue Dec 15 12:03:38 2020 Thread: 0 Score: -33.475679 1:-1728.051775 2:-2034.037998 3:-2628.967909 4:-3384.485558 5:-4250.997659 Tolerance: 0.013235 Clear: ihiseeofarehasinolotsonneainurednotitatthsetheytageusonethatschotthitherdroomsagoinualowuseiesanyansaidintheoacthesltreeteassattaportressingarealttelltharoohareetemcsonitheseenowohsoaststotorsnorstthereasereneatorthecrithanothentheestathsereadtogotheeareensaidinasathiasanianraideatemstasanttheusenetpinearlinetipoleareareintiesareditseeath Tue Dec 15 12:03:40 2020 Thread: 3 Score: -32.669044 1:-1729.218706 2:-2037.197029 3:-2638.316462 4:-3389.905238 5:-4234.391493 Tolerance: 0.015360 Clear: ihesetotapehasandletionneairandardtetotthsethemtoreasurethatsyhotthathpnardersageanaanewaseiesarmonsaidenchedaythesntspeteoitittimoftrestergaleslttellthafeehappeteryturetheseenowdhioastitetopsnonsttheseasestreatoftheyrethindthenchedstothselesaturethetoseensaidenstathiasareonsoidpateritisarttheastnetmineiflanetimoneoreaftertiesonedatteeach Tue Dec 15 12:03:41 2020 Thread: 3 Score: -31.535408 1:-1728.150491 2:-2039.363783 3:-2637.740824 4:-3384.693677 5:-4211.002528 Tolerance: 0.014592 Clear: ihesetocarehasandletionneairandardtetotthsethemtoreasurethatsyhotthathpnandersageanaanewaseiesarmonsaidenchedaythesntspeteoisittimoftnessergaredlttellthafeeharpeterysuretheseenowdhioastitetorsnonsttheseasestreatoftheynethindthenchedstothseredaturethetoseensaidendsathiasareonsoidpateritisarttheastnetmineiflanetimoneoneaftertiesonedatseeach Tue Dec 15 12:03:43 2020 Thread: 3 Score: -31.424866 1:-1728.526477 2:-2040.578297 3:-2638.422546 4:-3383.617028 5:-4208.685204 Tolerance: 0.013862 Clear: 
ihesetowarehasandletionneairandardtetotthsethemtoreasurethatsyhotthathpnandersageanaanewaseiesarmonsaidenchedaythesntspeteoisittimoftnessergaredlttellthafeeharpeterysuretheseenowdhioastitetorsnonsttheseasestreatoftheynethindthenchedstothseredaturethetoseensaidendsathiasareonsoidpateritisarttheastnetmineiflanetimoneoneaftertiesonedatseeach Tue Dec 15 12:04:20 2020 Thread: 0 Score: -31.302090 1:-1729.530229 2:-2048.550035 3:-2639.600821 4:-3386.839384 5:-4212.082847 Tolerance: 0.007045 Clear: ihesetowarehasandletionnearrandardtetotthsethestoreasurethatsyhotthathenandersageanaanelaseresarsonsaidenchedaythesntseeteoipittiumftnrspergaveslttellthafeehareeterypuretheseenoldhimastitetorsnonsttheseasestreatmftheynethindthenchedstothsevesaturethetoseensardenspathiasareonsordeateritisarttheastneturneiflanetrumneoneaftertiesonedatpreach Tue Dec 15 12:04:22 2020 Thread: 3 Score: -31.266461 1:-1726.438612 2:-2047.896485 3:-2643.871871 4:-3386.416704 5:-4229.849629 Tolerance: 0.006693 Clear: mholetofivehisandletconnegoranyardtotatthsetheetareasurethatsshotthathenanderlimeanairebaseoesgreandimponthedisthesreseeteactittiwasenestermiteslteellthaseehaveetersturotheseenobdhcaisectetovsnonsetheseasestreaeasehesnothindthentheydeathletesaturethetaseendioponstithmisgreansaopeaterceisgrttheastnetwoneislanetowareaneistoremelanepatteeith Tue Dec 15 12:04:22 2020 Thread: 0 Score: -31.259953 1:-1728.439858 2:-2048.419888 3:-2637.674700 4:-3387.757183 5:-4214.774362 Tolerance: 0.006693 Clear: ihesetowarehasandletionnearrandardtetotthsethestoreasurethatsyhotthathenandersageanaanelaseresarsonsaidenchedaythesntseeteoipittiumftnospergaveslttellthafeehareeterypuretheseenoldhimastitetorsnonsttheseasestreatmftheynethindthenchedstothsevesaturethetoseensardenspathiasareonsordeateritisarttheastneturneiflanetrumneoneaftertiesonedatpoeach Tue Dec 15 12:04:24 2020 Thread: 3 Score: -30.940463 1:-1725.727338 2:-2050.970779 3:-2642.581591 4:-3380.977663 5:-4227.919991 Tolerance: 0.006358 Clear: mholetogivehisandletconnegonaryandtotatthsetheetareasurethatsshotthatheranderlifeanairebaseoesgreandimponthedisthesreseeteactittiwasenesterfiteslteellthaseehaveeterstunotheseenobdhcaisectetovsnorsetheseasestreaeasehesnothindthentheydeathletesaturethetaseendioponstithmisgreansaopeaterceisgrttheastnetwoneislanetowareaneistonemelarepatteeith Tue Dec 15 12:04:25 2020 Thread: 3 Score: -30.832232 1:-1725.139669 2:-2049.215798 3:-2642.263064 4:-3378.695010 5:-4225.035191 Tolerance: 0.006040 Clear: mholetogivehisandletconnegonaryandtotatthsetheetareasurethatsshotthatheranderlifeanairebaseoesgreandimponthedisthesreseeteactittiwasenesterfiveslteellthaseehaveeterstunotheseenobdhcaisectetovsnorsetheseasestreaeasehesnothindthentheydeathlevesaturethetaseendioponstithmisgreansaopeaterceisgrttheastnetwoneislanetowareaneistonemelarepatteeith Tue Dec 15 12:04:31 2020 Thread: 0 Score: -30.544209 1:-1726.758403 2:-2050.932246 3:-2634.721709 4:-3368.047559 5:-4207.521917 Tolerance: 0.006327 Clear: mholetogivehisandletconneasnarrandtotatthrethestageararethatsshotthatheranderlifeanairewareserarsandimponyhedistherreseeteactittitusenorterfiredlteellthaseehaveeterstanotheseenowdhcuisectetovsnorretheseasestreaeusehesnothindthenyherdeathleredatagethetaseendispondtithmirareansaspeaterceisarttheartnettsneislanetstureaneistonemelarepattoeiyh Tue Dec 15 12:04:33 2020 Thread: 0 Score: -30.357758 1:-1726.690951 2:-2051.380930 3:-2646.489408 4:-3384.866104 5:-4208.207440 Tolerance: 0.006010 Clear: 
mholetogivehisandletconneasnarrandtotasthreshestageararethatsshotthatheranderlifeanairewareserarsandimponyhedistherreseeteactistitusenorterfitedlteellshaseehaveeterstanotheseenowdhcuisecsetovsnorretheseasestreaeusehesnothindthenyherdeathletedatageshetaseendispondtithmirareansaspeaterceisarttheartnettsneislanetstureaneistonemelarepattoeiyh Tue Dec 15 12:04:43 2020 Thread: 3 Score: -29.666076 1:-1729.397939 2:-2059.804959 3:-2639.599630 4:-3362.471502 5:-4180.433612 Tolerance: 0.005397 Clear: mholecolifehisandsetsonneesnotlandtotatthrethestageorarethatsshotthathetanderlimeanoireforeserersandimponyhedistherreseeteastattaburenortermitedsteessthareehafeeterstanotheseenofdhsuisestetofsnotretheseasescreaeurehesnothandthenyheldeathletedatagethecaseendispondtithmirereansaspeaterseaserttheorcnetbsnearsanetsbureaneirconemelatepattoeiyh Tue Dec 15 12:04:45 2020 Thread: 3 Score: -29.602736 1:-1725.616554 2:-2059.273222 3:-2645.489321 4:-3373.375186 5:-4200.962084 Tolerance: 0.005667 Clear: mholecolivehisandsetconneesnotlandtotatthrethestageorarethatsshotthathetanderlimeanoireforeserersandimponyhedistherreseeteactattaburenortermitedsteessthareehaveeterstanotheseenofdhcuisectetovsnotretheseasescreaeurehesnothandthenyheldeathletedatagethecaseendispondtithmirereansaspeaterceaserttheorcnetbsnearsanetsbureaneirconemelatepattoeiyh Tue Dec 15 12:04:47 2020 Thread: 3 Score: -29.596529 1:-1726.818897 2:-2060.361615 3:-2645.445077 4:-3368.768399 5:-4197.462324 Tolerance: 0.005950 Clear: mholecolivehisandsetconneesnotlandtotatthrethestageorarethatsshotthathetanderlimeanoireforeserersandimponyhedistherreseeteactattabusenortermitedsteessthaseehaveeterstanotheseenofdhcuisectetovsnotretheseasescreaeusehesnothandthenyheldeathletedatagethecaseendispondtithmirereansaspeaterceaserttheorcnetbsneassanetsbureaneisconemelatepattoeiyh Tue Dec 15 12:05:01 2020 Thread: 1 Score: -28.883319 1:-1725.216431 2:-2057.779634 3:-2659.765033 4:-3372.918851 5:-4188.337122 Tolerance: 0.003572 Clear: mholebodimehosandsetsonneasnotfundtotatthretheytakeoracethatsshotthathltunderliveanoiceforeseracyandomponthedisthercesleteastattainrenorticvotedsteessthareehamleterstanotheseenofdhsnosestetomsnotretheseasesbceaenrehesnothandthenthefdeathletedutakethebaseendispondtothmiraciansasplaterseasacttheorbnetisnearsanetsinceaneorbonemelatepattoeith Tue Dec 15 12:05:03 2020 Thread: 1 Score: -28.865406 1:-1725.952337 2:-2057.497391 3:-2649.574354 4:-3358.820538 5:-4179.770739 Tolerance: 0.003750 Clear: mholebodirehosandsetsonneasnotfundtotatthretheytakeoracethatsshotthathltunderliveanoiceforeseracyandomponthedisthercesleteastattainrenorticvotedsteessthareeharleterstanotheseenofdhsnosestetorsnotretheseasesbceaenrehesnothandthenthefdeathletedutakethebaseendispondtothmiraciansasplaterseasacttheorbnetisnearsanetsinceaneorbonemelatepattoeith Tue Dec 15 12:05:05 2020 Thread: 1 Score: -28.766098 1:-1724.176265 2:-2056.366830 3:-2652.576885 4:-3366.572203 5:-4183.321224 Tolerance: 0.003938 Clear: mholebowirehosandsetsonneasnotfundtotatthretheytakeoracethatsshotthathltunderliveanoiceforeseracyandomponthedisthercesleteastattainrenorticvotedsteessthareeharleterstanotheseenofdhsnosestetorsnotretheseasesbceaenrehesnothandthenthefdeathletedutakethebaseendispondtothmiraciansasplaterseasacttheorbnetisnearsanetsinceaneorbonemelatepattoeith Tue Dec 15 12:05:17 2020 Thread: 1 Score: -28.739654 1:-1724.939262 2:-2048.988166 3:-2642.908843 4:-3363.522581 5:-4160.399973 Tolerance: 0.003713 Clear: 
ehopegowirehasandistconnelsnormandtotatthretheytairorderthatsshotthatherandsupilsanoilsforeserleyandaemonthedistherleseeteaceattailsendrevelatediteeiithassshareeteusednotheseenofdhclasectstorsnorretheseasesgeraelsehesnothandthenthemdeathpetedatdisthegaseendismondeatheirlevansasmeateuceaslettheorgnetisneasianetsilleaneasgoneerparematedeith Tue Dec 15 12:05:19 2020 Thread: 1 Score: -28.723310 1:-1724.757720 2:-2051.988732 3:-2647.904735 4:-3367.656235 5:-4169.170735 Tolerance: 0.003898 Clear: ehopegowirehasandistconnelsnormandtotatthretheytairorderthatsshotthatherandsupilsanoilsforeserleyandaemonthedistherleteeteaceattailsendrevelatediteeiithassshareeteusednotheseenofdhclasectstorsnorretheteasetgeraelsehesnothandthenthemdeathpetedatdisthegateendismondeatheirlevantasmeateuceaslettheorgnetisneasianetsilleaneasgoneerparematedeith Tue Dec 15 12:05:37 2020 Thread: 1 Score: -28.690782 1:-1724.202591 2:-2049.640843 3:-2629.741592 4:-3345.019678 5:-4151.539244 Tolerance: 0.004703 Clear: ehopetofirehasandistconnelsnormandtotatthretheywairorderthatsshowthatherandsupilsanoilsforeserleyandaemonthedistherleteeteaceattailsendregelateriteeiithassshareeteusednotheseenofdhclasectstorsnorrewheteasetteraelsehesnothandwhenthemdeathpeteratdisthetateendismonreatheirlegantasmeateuceaslewtheortnetisneasianewsilleaneastoneerparematedeith Tue Dec 15 12:05:38 2020 Thread: 1 Score: -28.484399 1:-1723.771825 2:-2050.027758 3:-2634.395486 4:-3347.656714 5:-4151.423958 Tolerance: 0.004938 Clear: ehopetofirehasandistconnelsnorgandtotatthretheywairorderthatsshowthatherandsupilsanoilsforeserleyandaemonthedistherleteeteaceattailsendrevelatediteeiithassshareeteusednotheseenofdhclasectstorsnorrewheteasetteraelsehesnothandwhenthegdeathpetedatdisthetateendismondeatheirlevantasmeateuceaslewtheortnetisneasianewsilleaneastoneerparematedeith Tue Dec 15 12:06:08 2020 Thread: 1 Score: -28.228577 1:-1726.333696 2:-2056.799999 3:-2643.960418 4:-3350.627000 5:-4149.815515 Tolerance: 0.004160 Clear: eholesofirehasandistconnelsnotsandtotatthretherwayforsofthetsshowthathetandsrlipsanoilsforeserlorandaemonthedistherleteeteaceattaubrendrecopatediteeiithersshereetersesnotheseenofdhcbasectstorsnotrewheteesetsofeebrehesnothandwhenthesdeathletedatsysthesateendismondeatheirlocantasmeeterceaslowtheorsnetusnearianewsubleanearsoneeflatematedeith Tue Dec 15 12:06:10 2020 Thread: 1 Score: -28.176003 1:-1724.197559 2:-2059.034193 3:-2650.170906 4:-3357.618382 5:-4165.559772 Tolerance: 0.004368 Clear: eholesofirehasandistconnelsnotmandtotatthretheywayforsofthetsshowthathetandsalipsanoilsforeserloyandaemonthedistherleteeteaceattaubrendrecopatediteeiithersshereeteasesnotheseenofdhcbasectstorsnotrewheteesetsofeebrehesnothandwhenthemdeathletedatsysthesateendismondeatheirlocantasmeeteaceaslowtheorsnetusnearianewsubleanearsoneeflatematedeith Tue Dec 15 12:06:12 2020 Thread: 1 Score: -28.155502 1:-1723.879558 2:-2056.860897 3:-2649.004807 4:-3358.951103 5:-4162.069100 Tolerance: 0.004587 Clear: eholesofirehasandistconnelsnotsandtotatthrethegwayforsofthetsshowthathetandsalipsanoilsforeserlogandaemonthedistherleteeteaceattaubrendrecopatediteeiithersshereeteasesnotheseenofdhcbasectstorsnotrewheteesetsofeebrehesnothandwhenthesdeathletedatsysthesateendismondeatheirlocantasmeeteaceaslowtheorsnetusnearianewsubleanearsoneeflatematedeith Tue Dec 15 12:06:34 2020 Thread: 0 Score: -27.950095 1:-1723.237445 2:-2045.473079 3:-2638.383540 4:-3340.724730 5:-4129.171912 Tolerance: 0.004519 Clear: 
ihopeyouarehasanesstconnelinormanetotatthrethemwasforeofthatsshowthatheratesapausanoabstoreierlomansaidontheeastherbereeteacrittillneterryouatedsteessthansshareeteasrenotheseenotehclasectstorsnorrewhereaseryofaelnehestothinewhenthemseathpetedatesstheyareensaidondrathiarloyanraideateaceislowtheorynetlineinsanewillbeateanyoneifparedatreeath Tue Dec 15 12:06:36 2020 Thread: 0 Score: -27.766163 1:-1722.614187 2:-2057.978830 3:-2656.746951 4:-3367.821826 5:-4157.948654 Tolerance: 0.004744 Clear: ihopeyouarehasanesstconnelinormanetotatthrethemwasforeofthatsshowthathgratesapausanoabstoreierlomansaidontheeastherbergeteacrattallneterryouatedsteessthansshargeteasrenotheseenotehclasectstorsnorrewhereaseryofaelnehestothanewhenthemseathpetedatesstheyareensaidondrathiarloyanraidgateaceaslowtheorynetlineansanewillbeateanyoneifparedatreeath Tue Dec 15 12:06:37 2020 Thread: 0 Score: -27.220263 1:-1724.216582 2:-2060.247720 3:-2658.167816 4:-3358.986349 5:-4136.674923 Tolerance: 0.004507 Clear: ihopeyouarehasanesstconnelinormanetotatthrethemwasforeofthatsshowthathgratesapausanoabstoreierlomansaidontheeastherbergeteaceattallnethreyouatedsteessthansshargeteaseenotheseenotehclasectstorsnorrewhereaseryofaelnehestothanewhenthemseathpetedatesstheyareensaidondeathiarloyanraidgateaceaslowtheorynetlineansanewillbeateanyoneifparedateheath Tue Dec 15 12:09:01 2020 Thread: 1 Score: -26.950138 1:-1721.531403 2:-2055.149637 3:-2644.104790 4:-3347.438235 5:-4116.600819 Tolerance: 0.003321 Clear: ihopeyouarehasandsstconnelinormandtotatthretheywasforeofthatsshowthathgrandsaparsanoabsforeierloyandailonthedastherbergeteaceattallrendrecoratedsteesstharsshargeteaseenotheseenofdhclasectstorsnorrewhereaseryofaelrehesnothandwhenthemdeathpetedatesstheyareendailondeathiarlocanrailgateaceaslowtheorynetlinearsanewillbeanearyoneifparelatedeath Tue Dec 15 12:09:38 2020 Thread: 2 Score: -26.803696 1:-1718.796330 2:-2054.015867 3:-2655.326015 4:-3369.414610 5:-4120.640407 Tolerance: 0.003957 Clear: ihopeyouarehasandsitconnelinorgandtotatthretheywasforeofthatsshowthathgrandinparianoabivoreierloyanrailonthedastherbergeteaceattallrendrecoratedsteessthariihargetenseenotheseenovdhclasectitorsnorrewhereaseryofaelrehesnothandwhenthegreathpetedatesitheyareenrailondeathiarlocanrailgatenceaslowtheorynetlinearsanewillbeanearyoneifparelatedeath Tue Dec 15 12:10:15 2020 Thread: 0 Score: -26.710201 1:-1723.892342 2:-2059.977268 3:-2647.764113 4:-3352.243201 5:-4106.941057 Tolerance: 0.003317 Clear: ihopeyouarehasandsetsonnelinorsandtotatthrethecwasforaofthatsshowthathgrandedpareanoabeforeierlocanmailonthedastherbergeteaseattallrendrecoratedsteessthareehargetedseanotheseenofdhslasestetorsnorrewhereaseryofaelrehesnothandwhenthesmeathpetedatasetheyareenmailondeathiarlocanrailgatedseaslowtheorynetlinearsanewillbeanearyoneifparelatedeath Tue Dec 15 12:10:56 2020 Thread: 3 Score: -26.441904 1:-1721.925608 2:-2059.865233 3:-2651.830293 4:-3355.293498 5:-4106.639854 Tolerance: 0.004346 Clear: ihopeyouarehasandsetconnelinorgandtotatthrethemwasforaofthatsshowthathgrandedpareanoabeforeierlomanrailonthedastherbergeteaceattallsendrecoratedsteessthaseehargetedseanotheseenofdhclasectetorsnorrewhereaseryofaelsehesnothandwhenthegreathpetedatasetheyareenrailondeathiarlocanrailgatedceaslowtheorynetlineassanewillbeaneasyoneifparelatedeath Tue Dec 15 12:11:20 2020 Thread: 1 Score: -26.344174 1:-1723.706875 2:-2060.160118 3:-2656.256942 4:-3355.515064 5:-4099.496088 Tolerance: 0.004271 Clear: 
ihopeyouarehasandsetsoffeminorgandtotatthrethenwasnorainthatsshowthathgrandedpaceanoabeforeierminafrailofthedastherbergeteaseattallsendrenicatedsteessthaseehargetedseanotheseenofdhslasestetorsforrewhereaseryinaelsehesnothandwhenthegreathpetedatasetheyareefrailofdeathiarminafrailgatedseasmiwtheorynetlifeassafewillbeaneasyoneinparelatedeath Tue Dec 15 12:11:22 2020 Thread: 1 Score: -26.337019 1:-1723.632038 2:-2058.891861 3:-2653.213290 4:-3350.275326 5:-4095.614117 Tolerance: 0.004484 Clear: ihopeyouarehasandsetsoffeminorgandtotatthrethenwasnoreinthatsshowthathgrandedpaceanoabeforeierminafrailofthedastherbergeteaseattallsendrenicatedsteessthaseehargetedseenotheseenofdhslasestetorsforrewhereaseryinaelsehesnothandwhenthegreathpetedatesetheyareefrailofdeathiarminafrailgatedseasmiwtheorynetlifeassafewillbeaneasyoneinparelatedeath Tue Dec 15 12:20:05 2020 Thread: 3 Score: -26.336957 1:-1726.647388 2:-2064.977263 3:-2660.339343 4:-3363.407285 5:-4088.966063 Tolerance: 0.004180 Clear: ihopesofarehasongsetsoffeninaryingtotatthmetherwasnameonthetsshowwhothpringerpaceonaabeuameiemnorafrainofthegasthemberpeteaseitwillsendmetocatedsteesstheseeherpeterseenowheseenoughslasesteworsformewhereesersoneelsehesnothingwhentheyreathpeteditesethesareefrainofdeathiamnotafrainpeterseisnowtheamsnewlifeissofewillbeaneassoneinparenotedeath Tue Dec 15 12:20:07 2020 Thread: 3 Score: -26.296175 1:-1724.923284 2:-2065.216820 3:-2661.174242 4:-3367.300902 5:-4098.169764 Tolerance: 0.004389 Clear: ihopesocarehasongsetsoffeninaryingtotatthmetherwasnameonthetsshowwhothpringerpaceonaabeuameiemnorafrainofthegasthemberpeteaseitwillsendmetocatedsteesstheseeherpeterseenowheseenoughslasesteworsformewhereesersoneelsehesnothingwhentheyreathpeteditesethesareefrainofdeathiamnotafrainpeterseisnowtheamsnewlifeissofewillbeaneassoneinparenotedeath Tue Dec 15 12:20:20 2020 Thread: 2 Score: -24.668201 1:-1713.367417 2:-2067.963707 3:-2691.270360 4:-3375.012351 5:-4053.461740 Tolerance: 0.003744 Clear: ihopesofarehasinglotsoffanintryingtocatchmethatwasntmeonthetsshowwhichbringoupapointaboutmeiamnotafrainofthegaschamberbecaaseitwillsendmetoparanlceallthesooherbecauseenowhaseenoughslasestoworsformewhereesersoneelsehasnothingwhentheyreachparanicesothesareafrainofneathiamnotafrainbecauseisnowthatmsnewlifeislifewillbeaneassoneinparanicedeath Tue Dec 15 12:20:22 2020 Thread: 2 Score: -24.190603 1:-1717.001652 2:-2067.998939 3:-2679.104107 4:-3359.350623 5:-4028.467854 Tolerance: 0.003557 Clear: ihopesofarehasinglotsoffanintryingtocatchmethatwasntmeonthetsshowwhichbringoupapointaboutmeiamnotafrainofthegaschamberbecaaseitwillsentmetoparallceallthesooherbecauseenowhaseenoughslasestoworsformewhereesersoneelsehasnothingwhentheyreachparalicesothesareafrainofleathiamnotafrainbecauseisnowthatmsnewlifeislifewillbeaneassoneinparaniceteath Tue Dec 15 12:21:57 2020 Thread: 0 Score: -24.078170 1:-1715.381242 2:-2069.442228 3:-2691.363772 4:-3370.837110 5:-4023.247484 Tolerance: 0.004489 Clear: ihopeyouarehatinglotsoffanintryingtocatchmethatwasntmeonthettshowwhichbringoupapointaboutmeiamnotafrainofthegaschamberbecaaseitwillsentmetoparallceallthesooherbecauseenowhateenoughslatestowortformewhereeteryoneelsehasnothingwhentheyreachparalicesotheyareafrainofleathiamnotafrainbecauseitnowthatmynewlifeislifewillbeaneasyoneinparaniceteath
Thanks for confirming that, f.reichmann!
For the record: applying the transposition to the original 340, and scoring n-grams with a Poisson distribution while comparing each with what is expected from a Gaussian distribution over a sample text (to apply a more standard statistical reasoning to the claim "it is the one and only solution beyond reasonable doubts"), cDecryptor converges to (almost) the same solution. Summed over all 1,2,3,4,5-grams, the result is only ~1.3 sigma away from the theoretical maximum possible score. None of the zkd/AZD code or language data is reused here, so it is a fully independent confirmation in terms of a homophonic computational solver.
Hi, this just made me curious, so I hope I may throw in some questions:
– "scoring n-grams with Poisson distribution and comparing each with what is expected from a Gaussian distribution from a sample text"
Could you please specify: the scoring of n-grams happens based on what, the homophones? Or the 'given' cleartext? Thanks. [Any readable cleartext would come up with some Gaussian n-gram distribution, wouldn't it?]
– "to apply a more standard statistical reasoning to claim "it is the one and only solution beyond reasonable doubts"
This statement is based on what?
– cDecryptor converges to (almost) the same solution
is there any open source code to look at what cDecryptor actually does (‘converging’)? Was the self-proclaimed cleartext actually created with this program?
– In sum all 1,2,3,4,5-grams are only ~1.3 sigma away from the theoretical, maximum possible score
The cipher has no 4-grams nor 5-grams, so which ones exactly are you referring to? Could you provide the calculation leading to your '~1.3 sigma away from the maximum possible score'? Thank you.
Thank you, I am trying to follow your ideas (as it is meant to be confirming / for the record, it should be solid then).
QT
*ZODIACHRONOLOGY*
Hello,
I would like a little enlightenment here.
I am following the same path as f.reichmann, but in Python (I am a Python enthusiast).
I followed the ideas of Amrapali Dhavare (homophonic attack), who herself followed the ideas of Jakobsen (substitution attack) – links can be found on the net.
Current achievement: the Z408 is solved quickly if the "key allocation" is provided (I mean how many codes stand for a, how many for b, etc.).
Hill climbing on the number of allocations, the so-called "outer hill climbing", appears to be too slow (even after applying multiprocessing).
I have been examining the C++ code of cDecryptor but could not quite follow it.
Questions : (:-))
1) What is the change that is done? Is it changing the plaintext letter associated with a cipher code, i.e. a "unique change"? Just this small change? Something else, not as simple? What does AZdecrypt do? What does cDecryptor do? (I perform a swap.)
2) What is the evaluation function? "Anticipating Poisson", "n-gram too often -> likelihood dropped": can you explain? What does AZdecrypt do? What does cDecryptor do?
For cDecryptor, I presume you have a way of measuring a limit on the sum of n-gram frequencies that you must not exceed, and you punish if you do. What is done? An abs(reached – reference)? Something derived from the initial frequency table? What is this Poisson formula?
(What I do is sum the log10 of the n-gram frequencies and try to get it as high as possible. This is efficient when swapping, but not if I change to a "unique change" strategy, I suppose – as I noticed and understood from the dialogue in this thread.)
3) From my little experience I can confirm that going down a bit (after failing to go up) does improve things a lot. I used "simulated annealing" and the exponentiation stuff – the same idea as expressed by Jarlve, although he did not mention SA.
I can feel I am not very far but I am lacking a bit of the mathematics here.
Regards
Hello,
I want to try to clarify the idea in cDecryptor and the statistical idea behind it. The tasks are to describe features of natural language, to design from these features a comparable score that is tolerant to errors like wrongly assigned symbols or spelling mistakes in the clear text, and to run this to test against a large number of ideas on how the text might have been encrypted.
I describe my ideas in cDecryptor, and sketch my understanding of zkdecrypto and AZdecrypt in comparison:
Step 1: Describe features of natural language
1.1. Count the n-grams of a natural corpus. Here, an n-gram is a sequence of n characters that appears in the text after dropping the spaces. E.g. "the street" would be concatenated to "thestreet", and then it would have the 1-grams e(3), t(3), h(1), s(1), r(1), the 2-grams th(1), he(1), es(1), st(1), tr(1), re(1), ee(1), et(1), the 3-grams the(1), hes(1), est(1), …. Imagine doing this for a very large text corpus; then there would be a pattern of distribution of how often the 1-grams, 2-grams, 3-grams etc. occur. That basic idea is the same in zkdecrypto, AZdecrypt, cDecryptor, and any other code I have investigated so far.
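For illustration, counting n-grams this way is a few lines of Python (a sketch, independent of any of the solvers):

from collections import Counter

def ngram_counts(text, n):
    # Count overlapping n-grams after stripping spaces.
    s = text.replace(" ", "").lower()
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

print(ngram_counts("the street", 2))
# Counter({'th': 1, 'he': 1, 'es': 1, 'st': 1, 'tr': 1, 're': 1, 'ee': 1, 'et': 1})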
1.2. Additional features such as the information entropy as defined by Shannon ( https://en.wikipedia.org/wiki/Entropy_( … ion_theory)), or the index of coincidence ( https://en.wikipedia.org/wiki/Index_of_coincidence ). Both describe the (1-)gram statistical distribution of a text; you may look at them as being similar to other statistical features like mean or variance.
Step 2: Assign a value to all these features
2.1. The cDecryptor idea is based on the observation that the n-gram counts match the criteria for expecting a Poisson distribution ( https://en.wikipedia.org/wiki/Poisson_distribution ), which is essentially a high n and a low p. This is particularly true for n-grams longer than only one letter, and not so much for the frequent 1-grams like e and t. Hence, I postulate that the n-gram counts roughly follow a Poisson distribution. This allows judging whether an n-gram appears too seldom, too often, or just about right. Combining all n-gram counts is easy: just multiply the likelihoods coming from the Poisson distribution, and to make it numerically more stable in the computer code, add the logarithms instead. Using entropy or IoC I considered unnecessary, because the n-gram distributions are already described in deeper detail by Poisson.
2.2. The zkdecrypto and AZdecrypt idea is that we may expect more frequent n-grams to appear more often in the solution candidate. The fundamental concept is hence to add up the frequencies (more precisely, they as well use the logarithms to achieve numerical stability).
Step 3: Combine the values of all these features into a comparable number
Having several n-gram counts, and in the case of zkdecrypto and AZdecrypt additionally entropy or IoC, exposes the code to over-adaptation into local optima. E.g. for zkdecrypto and AZdecrypt, only adding the most frequent n-grams would give the highest scores to meaningless solutions: "e", "t" and "h" are very frequent, as are "th" and "he", as is "the". The algorithm so far would then give high scores to a repetitive pattern like "thethethethe…", which obviously is no meaningful text. Something similar is true for cDecryptor: adding the logarithms of the 1-gram, 2-gram, …, n-gram counts can over-adapt on the 1-grams, as they converge easily with little "cross-word puzzle" effect, and prevent finding better solutions on e.g. the 5-gram count.
3.1. cDecryptor: Each feature itself has a statistical distribution. In particular, which deviations from the expected n-gram counts are normal? For that, a sample text is scored at a length of 340 characters, and each time the n-gram score from the Poisson distribution (i.e. its likelihood) is calculated. That allows computing means and standard deviations, and therefore specifying a likelihood that a given text is normal text. To respond to Quicktrader's question on why this is statistically cleaner: it matches the concept of taking the most likely solution that is consistent with the side constraints as the correct answer.
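A sketch of that calibration and normalization step (hypothetical Python, reflecting my reading of the description rather than the cDecryptor source; score_fn stands for a Poisson scoring function like the one sketched earlier in the thread):

import math
import random

def calibrate(sample_text, score_fn, length=340, samples=1000):
    # Mean and standard deviation of the score over random 340-character windows.
    values = []
    for _ in range(samples):
        start = random.randrange(len(sample_text) - length)
        values.append(score_fn(sample_text[start:start + length]))
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))
    return mean, std

def normalized_score(candidate, calibrations, score_fns):
    # Sum of Gaussian log-densities, one per n-gram length.
    total = 0.0
    for n, (mean, std) in calibrations.items():
        z = (score_fns[n](candidate) - mean) / std
        total += -0.5 * z * z - math.log(std * math.sqrt(2 * math.pi))
    return total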
3.2. zkdecrypto and AZdecrypt: They both combine the different values in a weighted multiplication or addition. The weights look heuristic and engineered to me and, without further explanation, are justified only by the observed fact that they work extremely well. These heuristic formulas, with their lack of theoretical explainability, were my motivation to write cDecryptor, speculating that the formulas are over-optimized for the tested set of ciphers, in particular the z408.
Step 4: Give the only partially correctly deciphered solution candidates a score, tell whether one is better than another, and optimize the solution
In all of zkdecrypto, AZdecrypt and cDecryptor, this is done by hill-climbing with simulated annealing and testing many candidates. In particular the AZdecrypt hill-climber appears strong to me, as does its optimization for speed to minimize computational effort.
Some direct responses:
Palpatine:
1. A change is in each case the smallest change possible, which is to assign a different letter to one cipher symbol. This tends to get stuck quickly, and here simulated annealing (giving worse solutions a controlled likelihood of acceptance), as well as a partial reset that randomly changes a larger number of symbols at once, help escape the local optimum.
2. Whether you add log10, log2 or natural logarithms does not matter. Using a logarithm is important, though, to keep numerical stability by converting a multiplication of tiny numbers into a sum of their logarithms (a small numeric example follows after this list).
3. I got the hints for simulated annealing from both David and Jarl. It is called "temperature" in the zkdecrypto and AZdecrypt code, and goes from "hot" to "cold". In cDecryptor, I keep the temperature at a level where I observe a rate of tolerated acceptances that I have found to work well, exposing it as a configuration option.
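A tiny illustration of the numerical point in 2. above (standalone Python, not solver code):

import math

probs = [1e-5] * 400                    # 400 tiny n-gram likelihoods
print(math.prod(probs))                 # 0.0 – the product underflows to zero
print(sum(math.log(p) for p in probs))  # about -4605.17 – the log sum stays usable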
Quicktrader:
1. In cDecryptor, the scoring counts the n-grams in the solution candidate and computes how likely it is to get this count of n-grams in a natural text of length 340. The logarithm of that likelihood is the score. Finding "the" too few times is bad, finding "the" too many times is bad, and matching the average is best. If an n-gram appears in the solution text that never appears in natural text, let's say "xyz", then it is assigned a likelihood reflecting that not having found "xyz" in a corpus of some gigabytes does not allow saying whether the likelihood of occurrence is really zero, or whether a corpus twice as large would have had one occurrence. That also avoids computing a logarithm of zero in this case.
In zkdecrypto and AZdecrypt, it is the sum of the occurrences of each n-gram in the solution candidate, weighted by the frequency with which the n-gram appears in natural text. Finding "the" gives more score than finding "sun", and finding "the" three times gives three times the score. To prevent over-adaptation, zkdecrypto and AZdecrypt use entropy (zkdecrypto also IoC) in a heuristic formula to limit the result to meaningful text.
2. See above. cDecryptor takes the most likely solution that is consistent with the constraints (the symbol sequence in the cipher and the n-gram distribution of natural text) as the best solution. zkdecrypto and AZdecrypt take the best solution as the highest score of their heuristic formulas. I admit that their heuristic formulas work better, although I find them theoretically less convincing unless we find an explanation.
3. Yes, the first thing I did when reading about the solution was to apply the transposition described in Dave's video to the cipher and run a solution with cDecryptor. Except for a few single letters, the solution is the same (see above in this thread). For open-source transparency, all of zkdecrypto, AZdecrypt and cDecryptor are publicly available: zkdecrypto and cDecryptor on github, AZdecrypt in one of the posts here in the forum. There are no secrets.
4. I define an n-gram as a sequence of letters after stripping spaces. See the example of "the street" above.
The final step is to throw new ideas and computing power at the cipher and see the results. Until a few weeks ago, that was the state of the art.
What is a decent, clear way of rating a clear text (for a homophonic attack, for instance)?
Let's say it is composed of an "n-gram quality" and an "entropy quality".
====
For the n-gram part, things are pretty clear (for me!).
Let n be 3, 4 or 5.
Take an n-gram database (as a text file, for instance) that counts occurrences of sequences of n letters in the target language.
Sum all the counts of this database into s.
Make a table "ngram_log_freq_table" that for any n-gram gives log10(nb_occurrences(ngram) / s).
In the plain text, sum nb_occurrences(ngram) * ngram_log_freq_table(ngram) over all *distinct* n-grams present
(use a very low default value for n-grams that do not exist in the "ngram_log_freq_table").
This makes the "n-gram quality".
===
On its own, this leads to the issue of repeating frequent n-grams – the "over-adaptation" explained earlier – which is not efficient.
===
For the entropy part, things are not that clear (for me!).
Take all letters in the clear text. Let n be the size of the cipher (and the size of the plain).
Let freq(letter) be nb_occurrences(letter) / n.
Make a table letter_log_freq_table that gives, for a letter, freq(letter) * log2(freq(letter)) (log10?).
In the plain text, sum letter_log_freq_table(letter) over all distinct letters present.
Do not change the sign, since the lower the entropy the better.
This makes the "entropy quality".
aabbbc -> 2 * log2(2) + 3 * log2(3) + log2(1) = 2×1 + 3×1.58 + 0 = 6.74
==
For the overall quality of a putative text, use:
ngram quality + entropy_quality
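Here is a minimal Python sketch of what I have in mind, just to make the question concrete (the ngram_log_freq_table is built from the database as described above; the default value of -12 for missing n-grams is an arbitrary choice):

import math
from collections import Counter

def ngram_quality(plain, ngram_log_freq_table, n, default=-12.0):
    counts = Counter(plain[i:i + n] for i in range(len(plain) - n + 1))
    return sum(c * ngram_log_freq_table.get(g, default) for g, c in counts.items())

def entropy_quality(plain):
    freqs = Counter(plain)
    length = len(plain)
    # freq * log2(freq), sign kept: the lower the entropy, the higher this sum
    return sum((c / length) * math.log2(c / length) for c in freqs.values())

def overall_quality(plain, ngram_log_freq_table, n):
    return ngram_quality(plain, ngram_log_freq_table, n) + entropy_quality(plain)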
Is this correct (for the entropy part)?
Is the frequency of letters in the target language to be considered for the entropy?
Do the n-gram quality and the entropy quality add up on the same scale, since one uses log10 and the other log2?
Do they require some specific coefficients?
Is this method correct?
Sorry to be so slow
Python program (about 1000 lines)
Powerful PC
10 processes
So after all, the explanations from Mr Reichmann were not that bad!
thanks
==================================================
i like killing people because it is so much fun it i a more fun than killing wild game in the forrest because man is the moat danger tue an amal of all to kill something gives me the moat thrilling experence it is even better than getting your rocks off with a girl the best part of it i at hae when i die i will be reborn in paradice and all the i have killed will become my slaves i will not give you my name because you will try to slo i down or atop my collecting of slaves for my afterlife e be or i et emeth hp it i
==================================================
index_of_coincidence=1.7010165245459363 (reference=1.73)
dictionary_quality=-365.3578126654142
quality=ngram qual=-2117.388688761953 entropy qual=1842.835358974869 quality=-3960.224047736822
time taken=347.5177159309387
--------------------------
abcdefghijklmnopqrstuvwxyz
7Vef+JR)9 /#q(!= @5YcAj_ 8 z6Q MP % DT rFH G E U B OX tKI S N k ^d L l W Z p
--------------------------