
Initial attempt to validate Circulation from the Kinesis blockchains using linux code

The following posts were a response to this "spreadsheet method":


Caution:
I have no real idea what I'm doing.
Completely new to this blockchain analysis.
Nevertheless, the following dabblings in linux awk might help someone do actual intelligent work.

https://watcher.aboutkinesis.com/operations

kag
Manually:
set the custom network (upper right) in Field3 of the dropdown menu
save

Manually: go to the bottom of the page, click Export to CSV
WAIT

for the SAVE AS menu, then save to your hard disk.
If you miss the pop-up screen, you'll have to start over.

date output format:
linux> date +"%Y%m%e-%H%M"
20230223-1606

linux> cp stellar-export.csv stellar-export-20230223-1606.csv
linux> wc -l *
30149 stellar-export-20230223-1606.csv
30149 stellar-export.csv

number of lines in the file = row count
minus one to exclude the header
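That header subtraction can be scripted rather than done in one's head. A minimal sketch, using a made-up sample.csv (the real export would be substituted):

```shell
# Build a tiny hypothetical CSV: one header line plus two data rows
printf '%s\n' 'id,amount' '1,10' '2,20' > sample.csv

# Data-row count = total lines minus the header
rows=$(( $(wc -l < sample.csv) - 1 ))
echo "data rows: $rows"
# data rows: 2
```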

linux> head -1 stellar-export-20230223-1606.csv | sed 's/,/\n/g' | nl
1 "id"
2 "transactionSuccessful"
3 "sourceAccount"
4 "type"
5 "typeI"
6 "createdAt"
7 "transactionHash"
8 "assetType"
9 "from"
10 "to"
11 "amount"
12 "account"
13 "into"
14 "startingBalance"
15 "funder"
16 "signerKey"
17 "signerWeight"
18 "masterKeyWeight"
19 "lowThreshold"
20 "medThreshold"
21 "highThreshold"
Not all rows have data in every field.
Rows with "payment" in Field4 carry their data in the first 11 columns (the amount is Field11).
linux> cat stellar-export-20230223-1606.csv | sed 's/"//g' | awk -F, '$4=="payment"{s1+=$11}END{printf "Total %-10.2f\n",s1}'
Total 12412576.75
Probably gobbledygook, but someone could adapt that to do something useful.
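For anyone wanting to reproduce the pattern without downloading an export, here is the same strip-quotes-then-sum pipeline run against a tiny made-up file; the 11-column layout and values only mimic the kag extract:

```shell
# Synthetic rows, quoted like the watcher export; sum Field11 where Field4=="payment"
printf '%s\n' \
  '"1","true","A","payment","1","d","h","native","A","B","100.50"' \
  '"2","true","A","create_account","0","d","h","native","A","B","999.99"' \
  '"3","true","A","payment","1","d","h","native","A","B","200.25"' > demo.csv

sed 's/"//g' demo.csv | awk -F, '$4=="payment"{s+=$11} END{printf "Total %.2f\n", s}'
# Total 300.75
```

Only the two "payment" rows are summed; the create_account amount is ignored, exactly as in the real pipeline above.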
 
For the sake of completeness, here's the same work for kau, a day later.
The payment amount here is in Field15 (Column 15), not Field11 as for kag in the previous post.

Using a slightly different output format (getting to grips with the forum software),
here's the file name I saved to:

Bash:
postpend=`date -u +"%Y-%m-%e-T%H:%M:%S-%Z"`
echo $postpend
2023-02-24-T09:37:45-UTC

linux> wc -l stellar-export-KAU-2023-02-24-T09\:37\:45-UTC.csv
82438 stellar-export-KAU-2023-02-24-T09:37:45-UTC.csv
corresponds to
Fetching of 82438 entries complete!
from
https://watcher.aboutkinesis.com/operations
Bash:
SomeTotalFromKAU_Blockchain=`cat stellar-export-KAU-2023-02-24-T09\:37\:45-UTC.csv | sed 's/"//g' | awk -F, '$4=="payment"{s1+=$15}END{printf "%-10.2f\n",s1}'`
echo $SomeTotalFromKAU_Blockchain
56603127.36

Bash:
linux> echo "scale=3; 12412576.75 + $SomeTotalFromKAU_Blockchain" | bc -l | xargs printf "%.2f\n"
69015704.11

This is a simple sum of the value found yesterday from the KAG blockchain (?) plus the equivalent from the KAU blockchain now.
It may be meaningless; nevertheless, this is to share simple linux bash and awk code which might inspire someone who actually knows what they're doing to produce real analysis.
 
Columns 11 and 15
correspond to
Columns K and O in a spreadsheet like LibreOffice Calc
Code:
linux>
for i in {A..Z}; do echo $i; done > alphabet
nl alphabet
     1    A
     2    B
     3    C
     4    D
     5    E
     6    F
     7    G
     8    H
     9    I
    10    J
    11    K
    12    L
    13    M
    14    N
    15    O
    16    P
    17    Q
    18    R
    19    S
    20    T
    21    U
    22    V
    23    W
    24    X
    25    Y
    26    Z
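Instead of counting down the alphabet file each time, a throwaway helper can convert a 1-based column number straight to its letter. A sketch only (A..Z; it does not handle AA and beyond):

```shell
# %c in awk's printf turns an ASCII code into its character; 64+1 = 65 = 'A'
col2letter() { awk -v n="$1" 'BEGIN{ printf "%c\n", 64 + n }'; }

col2letter 11   # K
col2letter 15   # O
```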
 
Counting the occurrences of "sourceAccount", Field3 for each extracted file in the 2 posts above:
kag
Code:
linux> cat stellar-export-20230223-1606.csv | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -15
   7784 GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
   5431 GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
    888 GB6ZZMIBRO6RQVOCPABHNVEURN3H6VINUB2MTKZV2BJVWP5WFDBWED54
    868 GDEPDZCGVQ7U3337CM6QQU5GDDXMEXVM77WNLW3RFNP6JS45HVYW53AL
    768 GBDLFOIWRYZ7TU4I6KUY7ICAALJ7HOVENUU6NJL2IHMSXDJV427EEOBV
    232 GCZ4KXQFSANZK5PPPV2Z37O4NNQCL4XNWGM5TZSZSETSJYVR7QDVFDQU
    171 GDOJVZUE6YN7YDGXVLWEYEH6NBKPAHJQ3A663VJAOZWBLM2G3LQ6CAOC
    134 GBPW7IJL5KRDOM5RGMKVVPO6Q6DLBAZIBXV3AB5OIXKZ5MUQC4GIRWRQ
    110 GDHGUKPJ4FAPFFNAFMEE2DOKYKMLFO2K47V6APDDIJ4WOIH3M7FFXRCQ
    108 GDEYJZ33T3EGHUSS6YB6MODS66KX7BVESEPPZZLNAAWJESDF326FL7S7
     94 GBNPTJNYCG4VJ5J3RCPU44OKX3FNIDQDKWUVFMDFFPOO74OZVQQDLVZS
     94 GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I
     88 GB572UTY2ISPYYVOPIAVYZ3Q227UMXPLVPTUZBF4J6OS7BUBIJKNAHX3
     81 GCVPYATUGRKCUZAQNQN3RWWT4EQ5Z2Q7U4SQL7ZFRX7HCODEHINQBZVU
     80 GBNUYBLK3P3LYNWLUIB7RAXKX6JNXQXCWJCSP7LB5SGFLG4K472TOUEV
kau
Code:
linux> cat stellar-export-KAU-2023-02-24-T09\:37\:45-UTC.csv | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -30
  33540 GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
   7030 GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
   1683 GDOVPGZ6F6DXY4MPHLLL6VFF5IOOZG5ITM7LNR535W6OGHDAPD2IJTKA
   1347 GDEPDZCGVQ7U3337CM6QQU5GDDXMEXVM77WNLW3RFNP6JS45HVYW53AL
   1042 GCLJ4RQRG3HAN3GGO4DHMHI7STA6PKOBWQJLH3NRS56Z647SLYPHPN7I
    981 GCTKTMOHRD6WRRGBXAR3ZNLHWIKXDNCJYWDKMGZQNC6T47VZ2JG7ZJBL
    975 GBW4XYJPEK7CI4TSK2T56I4TOXH3ARQQXPTVNRGZEDJNEDYP4V4OMP5I
    960 GBVNQGRZYWDRBE57TU7Z62IRLFODSIFZ3WW2QQH4V4YRSGDB6AP5VNRS
    945 GBBXN3CBWAFH5P46KJRERQHDNPWWJQXGZ76WF2H27EETPMMOBEUQ62NN
    886 GD7EOW2PIT4YPLBAP3O4SXKVA5PSRRJKYDO5IXNQDASXPKGCHZJ2TX2Q
    876 GBBIXA2YD33DKROEKSRP5MYEGBYSUKGAUQPCFYW7HHI5NHRRNYNWSIBM
    849 GCRKCT7TWFVMET5KWFOP2OHN6L57Z3U6QGXKVEULEBFE6FBUUNVEH5MA
    817 GB3LQCN3D6BCXL6P46FZBRVVMV7YDBZF3MSDHOZJFMTTQ3SBDJHY5UKV
    636 GCNAJ2UB5K3BWRYP3RF2OQY4UUF5YTKM2556T2KBJA3W5474VIQ4ZSVC
    579 GDKKDENEIVFPGZZP6ZYE5M2I47SLIO5432QYTXLZ3C4QQJ753TFYN7GK
    565 GAW2TKRSNRX46ILOQ3E6SVUFY5CGH2I34PLUIRQH3IXVWFJXQQKMLXIV
    438 GDZ3HIR4PF26T5Q6R7O5ZP4NY75SIXPDLQ3IQWII3ZZ64BFFWCXFOETK
    396 GCO3DY4YYJQAPUZJP4QOMOYBQ2QFYWH54NZ3CJYLETTG3Y7ODUCAOREU
    382 GD4R5ODYEDCBMN2KUKB5RV2NKT5LNJBWIDRV5HGYW6YUOBI2QDAQS5EH
    377 GBNPTJNYCG4VJ5J3RCPU44OKX3FNIDQDKWUVFMDFFPOO74OZVQQDLVZS
    372 GDJUZFVEU2DP5VQYRJNC5KTFZFM5XSTQ6BJI4YUCQM3SS2U4XGQWUP45
    347 GAHPAUTIZTRTMOTUJ6MPFAWFJ2SSC5S6Z5YUL7QKIUDRESKATFOZJ4HP
    281 GBBK7MGOGNWXMR65HN5AGV5KZN7IS25EHLX7UUVZVYQCD3PXTMHLFBWK
    278 GDEYJZ33T3EGHUSS6YB6MODS66KX7BVESEPPZZLNAAWJESDF326FL7S7
    265 GCE7K22RUSQ47QRQPIV4N2IZI33DIRWAV2RARIFFYEODO7HXOUQIJT7T
    262 GB55CGKHPDINPIID7PMDDHYB6LDGJQ7LLN4YSZL4WNYEV6PEKBBVA2YV
    261 GCPP5GNNP63HJLRC45D4AJ7T6QTJIEJUEG3UHYKM3RF5343GZIY7JUBO
    260 GC22K5MZTCWOQVB5B5DR5ZSIZ4P3ZOS3KCK256F3TYJL643UDW4TXQHJ
    257 GCTB3GWA4MU3G7DUAFQ2NFJN5WVFDLRQX4LDMZHY7EDHZQNTKQMSN5ZP
    221 GBLPG6OFQR34AVG2VGFL7RPEBZ7BSFTDS5X2Z7F7ZS6VYLDCMCDVC6LG
 
Nerd side-track :geek:

1)
Stellar Addresses
Taking the top guy in the kag sorted list above:
Code:
linux> echo -n GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU | wc -c
56
"-n" avoids appending an extra newline to the end of the string
Code:
linux> cat stellar-export-20230223-1606.csv | sed 's/"//g' | awk -F, 'NR>1 {print length($3)}' | uniq -c
  30149 56
"NR>1" skips the first record, which holds the header string for each column (aka field; aka "$i" in awk)
All 56 characters long
That seems to match the design of Stellar
Matches the line count (= the record count) for the file:
Code:
linux> wc -l stellar-export-20230223-1606.csv
30149 stellar-export-20230223-1606.csv

2)
Transactions
Code:
linux> echo -n 236ee99f5a6fe01b62168357a5afdca75191148fda8332a2c146b6db3cbc606c | wc -c
64
64 characters long
Also seems to match the theory
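Both length checks can also be done without wc at all, using the shell's own ${#var} expansion. A small sketch, reusing the address and hash from above:

```shell
# ${#var} gives the string length directly, no echo -n | wc -c needed
addr=GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
txid=236ee99f5a6fe01b62168357a5afdca75191148fda8332a2c146b6db3cbc606c

[ "${#addr}" -eq 56 ] && echo "address length OK"
[ "${#txid}" -eq 64 ] && echo "transaction hash length OK"
```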

3)
To look those up manually from your browser:

1) explorer.kinesis.money GUI for the address
2) explorer.kinesis.money GUI for the transaction
 
Another way to check the length is to build your own ruler:

This requires a non-proportional font, which is why the forum's code icon is so useful (the last one, to the far right, on the second row here):

[Screenshot: the forum editor toolbar, with the code icon at the far right of the second row]



Code:
linux>
seq 1 9 > t
echo 0 >>t
cat t
cat t | paste -s -d '' > u
paste -d '' u u u u u u

123456789012345678901234567890123456789012345678901234567890
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU

for i in {1..6}; do printf "%10d\n" $i; done > t
paste -s -d "" t > v
cat v
         1         2         3         4         5         6
eye-ball it:
56
yeah :whistle:
 
Time to open @Uchiki 's kag spreadsheet above (OP)...

Go to Row 20 (just for kicks) and copy his formulae for Columns W, X and Y without thinking too hard, like so:
Code:
linux> 
echo '
Column W
=IF(AND(E20=1, M20="GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU", N20<>"GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI",  N20<>"GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I"),O20,0)

Column X
=IF(AND(E20=0, K20="GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU", H20<>"GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI",  H20<>"GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I"),J20,0)

Column Y
=IF(AND(E20=1, M20="GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E",  OR(N20="GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI",   N20="GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU")),O20,0)' > t
Gosh. That worked.

Now extract out all those nice address strings, borrowing heavily from StackOverflow (may those wonderful altruists be blessed for all eternity)
Code:
linux> grep -o "G[^,]*" t > u
linux> cat u
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU"
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI"
GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I")
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU"
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI"
GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I")
GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E"
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI"
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU"))
wow. that worked. linux is all about trial and error

Clean that up a bit:
Code:
linux> cat u | tr -d '[:punct:]' | sort -u
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I
GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
Grab the previous work from a post higher up and write it to the file list
Code:
cat stellar-export-20230223-1606.csv | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 > list
Create a nice long grep pattern (grep = search in linux) to run against that list file, just created.
Purpose? To identify the important addresses on the Blockchain:
Code:
linux> grep 'GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI\|GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I\|GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E\|GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU' list
   7784 GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
   5431 GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
     94 GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I
     44 GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
Looks as if the first 2 are the important ones.
The last 2 are probably related to row 20 = record 20
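The long escaped-pipe pattern works, but grep can also read one fixed string per line from a file with -F -f, which scales better as the address list grows. A sketch on made-up data (dummy addresses, hypothetical list file):

```shell
# -F: match fixed strings (no regex), -f: read one pattern per line from a file
printf '%s\n' '  10 GAAA' '  20 GBBB' '  30 GCCC' > list
printf '%s\n' GAAA GCCC > patterns

grep -F -f patterns list
#   10 GAAA
#   30 GCCC
```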

The first 2 are most likely, 2 of the 3 that @Uchiki was writing about:
  1. minting emission account
  2. root account
  3. inflation account
I have no idea which is which yet. I am a noob in matters blockchain.
But hey, we are progressing, are we not?
Data munging can be fun :unsure:
 
A repeat of the above.
Tidied up in one place.
Getting our tool-box ready to do actual work...
Code:
#
# KAG, KAU in alphabetical order
#

KAG_extract=../Blockchain-Exploration/KAG/stellar-export-20230223-1606.csv
KAU_extract=../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09\:37\:45-UTC.csv

head -1 $KAG_extract | sed 's/,/\n/g' > KAG-column-headers
head -1 $KAU_extract | sed 's/,/\n/g' > KAU-column-headers

for i in {A..Z}; do echo $i; done > alphabet

# delete last 5 lines (letters of the alphabet not used)
#
linux> paste -d " " alphabet KAG-column-headers KAU-column-headers | head -n -5 | nl | xargs printf "%-3d%-3s%25s%25s\n"
1  A                      id                    id
2  B      transactionSuccessful    transactionSuccessful
3  C              sourceAccount            sourceAccount
4  D                       type                     type
5  E                      typeI                    typeI
6  F                  createdAt                createdAt
7  G            transactionHash          transactionHash

8  H                  assetType                  account
9  I                       from                     into
10 J                         to          startingBalance
11 K                     amount                   funder
12 L                    account                assetType
13 M                       into                     from
14 N            startingBalance                       to
15 O                     funder                   amount

16 P                  signerKey                signerKey
17 Q               signerWeight             signerWeight
18 R            masterKeyWeight          masterKeyWeight
19 S               lowThreshold             lowThreshold
20 T               medThreshold             medThreshold
21 U              highThreshold            highThreshold

#
# 2 extra carriage returns added manually to help quick reading
#
# 21 columns. Same as for the KAG extract. Yet Columns 8-15 have different content
# Presumably Derek's watcher node has code with different logic for each CSV extract
#
 
Assigning bash variables for readability later.
Code:
echo $KA{G,U}_extract | xargs printf "%s\n%s\n"
../Blockchain-Exploration/KAG/stellar-export-20230223-1606.csv
../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09:37:45-UTC.csv

cat $KAG_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 > KAG-reference

linux> cat $KAG_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -5
   7784 GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
   5431 GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
    888 GB6ZZMIBRO6RQVOCPABHNVEURN3H6VINUB2MTKZV2BJVWP5WFDBWED54
    868 GDEPDZCGVQ7U3337CM6QQU5GDDXMEXVM77WNLW3RFNP6JS45HVYW53AL
    768 GBDLFOIWRYZ7TU4I6KUY7ICAALJ7HOVENUU6NJL2IHMSXDJV427EEOBV
#
# Extract the top 2 to shell variables
#
cat $KAG_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -2 > t
KAG_address1=`cut -c9- t | head -1`
KAG_address2=`cut -c9- t | tail -1`
echo $KAG_address{1..2} | xargs printf "%s\n%s\n"
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E



# Same again Sam for KAU

cat $KAU_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 > KAU-reference

cat $KAU_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -5
  33540 GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
   7030 GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
   1683 GDOVPGZ6F6DXY4MPHLLL6VFF5IOOZG5ITM7LNR535W6OGHDAPD2IJTKA
   1347 GDEPDZCGVQ7U3337CM6QQU5GDDXMEXVM77WNLW3RFNP6JS45HVYW53AL
   1042 GCLJ4RQRG3HAN3GGO4DHMHI7STA6PKOBWQJLH3NRS56Z647SLYPHPN7I

cat $KAU_extract | sed 's/"//g' | awk -F, '{print $3}' | sort | uniq -c | sort -b -r -n -k1 | head -2 > t
KAU_address1=`cut -c9- t | head -1`
KAU_address2=`cut -c9- t | tail -1`
echo $KAU_address{1..2} | xargs printf "%s\n%s\n"
GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
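A side note on the cut -c9- trick: it relies on uniq -c's fixed-width count field always putting the address at column 9, which is fragile. Letting awk split on whitespace and take the second field works regardless of padding. A sketch with dummy addresses:

```shell
# awk splits on whitespace, so the count's width doesn't matter
printf '%s\n' '  33540 GDUMMYONE' '   7030 GDUMMYTWO' > t

addr1=$(awk 'NR==1{print $2}' t)
addr2=$(awk 'NR==2{print $2}' t)
printf '%s\n%s\n' "$addr1" "$addr2"
# GDUMMYONE
# GDUMMYTWO
```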
 
In what will hopefully follow, assume the following half-dozen linux bash variables are set
Code:
echo $KA{G,U}_extract | xargs printf "%s\n%s\n"
../Blockchain-Exploration/KAG/stellar-export-20230223-1606.csv
../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09:37:45-UTC.csv

echo $KAG_address{1..2} | xargs printf "%s\n%s\n"
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E

echo $KAU_address{1..2} | xargs printf "%s\n%s\n"
GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
 
We can now slowly but systematically practise our moves.
How to:
- pass a bash variable to awk
- use a simple condition statement with that variable
Code:
head -1000 $KAG_extract | sed 's/"//g' | awk -F, -v a=$KAG_address1 'a==$9 {print;}' | nl
     1    104657117369077761,true,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,payment,1,2023-02-06T17:34:22Z,85ddf1d3efa1c458dcaceab583b989b23c518c2bbbff27717b5b558e96e87a20,native,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,GDT24Q5ZBGAZWGJCHE3NOUATGZYKK7ZRGRJ3MPYDMLKM3REXLYXAADIE,200.0000000,,,,,,,,,,
     2    104451813637361665,true,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,payment,1,2023-02-03T17:45:21Z,e53a498fc2f9eb3adbbddb4da14e586a1621e6693c011f1c728aeb698e2321e9,native,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,GCVIU4FBDFZKVCS2RNOUSRGTNINWLPPLHESFVSO4HNZNPCDSI3ERRXUG,200.0000000,,,,,,,,,,
     3    103392988234780673,true,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,payment,1,2023-01-19T06:01:31Z,89200061fd2998ee0dd195f07a709e0cb9fd755ef46f2477e6bdb3700beb3c86,native,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,GDIA2QDTCYOTVRLLJLXBIK7VGIGQ7JW3UAACWRLSKIXXFETUQBWFQ6P7,200.0000000,,,,,,,,,,
 
While sniffing around, saw a GAUCI.
We dig:
Code:
KAG_address3=GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI

cat $KAG_extract | sed 's/"//g' | awk -F, -v a1=$KAG_address1 -v a3=$KAG_address3 '($9==a3 && $10==a1) {sum+=$11} END {printf "%-10.2f\n",sum}'
3076312.15
That calculation may be meaningless but it shows how we can combine condition statements in awk. We add that syntax construct to our toolbox.
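The same two-variable, two-condition construct can be exercised on a synthetic file, which makes the arithmetic easy to check by hand (the column layout only mimics the kag extract; addresses are dummies):

```shell
# Sum Field11 where Field9==a3 and Field10==a1, exactly the construct above
printf '%s\n' \
  'x,x,x,x,x,x,x,x,AAA,BBB,10.05' \
  'x,x,x,x,x,x,x,x,AAA,CCC,5.00' \
  'x,x,x,x,x,x,x,x,AAA,BBB,2.00' > demo.csv

from=AAA; to=BBB
awk -F, -v a3="$from" -v a1="$to" \
  '($9==a3 && $10==a1) {sum+=$11} END {printf "%.2f\n", sum}' demo.csv
# 12.05
```

Only the first and third rows satisfy both conditions, so the total is 10.05 + 2.00.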
From the web explorer, pick out a transaction hash id at random (just about) to toy with it.
Sniff further.
Code:
KAG_t=2c13f0d1fff82823c69c4752b211d563b27640fa1eb47e8089587aa7836d7497

cat $KAG_extract | sed 's/"//g' | awk -F, -v t1=$KAG_t '($7==t1) {print;}'
76940147219763201,true,GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI,payment,1,2022-01-11T05:54:00Z,2c13f0d1fff82823c69c4752b211d563b27640fa1eb47e8089587aa7836d7497,native,GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI,GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU,200000.0000000,,,,,,,,,,
A GAUCI GAUCI GCGT
in columns
3 9 10
I have no idea what it means. But we can see it in the CSV extract made a week ago or so:
Code:
cat $KAG_extract | sed 's/"//g' | awk -F, -v t1=$KAG_t '($7==t1) {printf "%s\n%s\n%s\n", $3, $9, $10}'
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
We are slowly making better use of our tool-box...
What is this GAUCI fellow? I don't know but shuttling between web explorer and flat file, we can experiment with the numbers:
Code:
linux> grep GAUCI KAG-reference
     44 GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
Not in Top 4...
Let's see what Explorer has to say:
99996909554.4433304
?
Let's reverse-think what that may be:
Code:
99996909554.4433304
bc <<< "scale=12; 100 * 10^9"
100000000000
bc <<< "scale=12; 100 * 10^9 -1"
99999999999
bc <<< "scale=12; 100 * 10^9 - 99996909554"
3090446
100 billion minus about 3 million
Probably some sort of System Account?
I'll leave it there for now
 
To save straining the eyes too much,
might as well make use of what we have already done to increase readability:
Bash:
cat $KAG_extract | sed 's/"//g' | awk -F, -v t1=$KAG_t '($7==t1) {print;}' | sed 's/,/\n/g' | sed '/^$/d' | nl
cat $KAG_extract | sed 's/"//g' | awk -F, -v t1=$KAG_t '($7==t1) {print;}' | sed 's/,/\n/g' | sed '/^$/d' | nl > t

cat KAG-column-headers | sed '12,$d'
cat KAG-column-headers | sed '12,$d' > tt
paste tt t | xargs printf "%-20s%-3d%s\n"
gives output:
Code:
linux> paste tt t | xargs printf "%-25s%-3d%65s\n"
id                    1                                                  76940147219763201
transactionSuccessful    2                                                               true
sourceAccount            3           GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
type                     4                                                            payment
typeI                    5                                                                  1
createdAt                6                                               2022-01-11T05:54:00Z
transactionHash          7   2c13f0d1fff82823c69c4752b211d563b27640fa1eb47e8089587aa7836d7497
assetType                8                                                             native
from                     9           GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
to                       10          GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
amount                   11                                                    200000.0000000
 
The 6 transaction types...
Experimenting...
Field4 aka Column4 in both KAG and KAU extract files from the Watcher node
Code:
cat $KAG_extract | sed 's/"//g' | awk -F, '{print $4}' | sort -u
account_merge
create_account
inflation
payment
set_options
type

cat $KAU_extract | sed 's/"//g' | awk -F, '{print $4}' | sort -u
account_merge
create_account
inflation
payment
set_options
type
(the stray "type" is just the header row leaking through, since the pipeline never skips line 1)
Set up a flat file for next step:
Code:
echo '
account_merge
create_account
inflation
payment
set_options
type' > t

cat t | sed 1d > tt

#
# Test loop construct:
#
linux> while IFS= read x; do echo $x; sleep 3s; done < tt
account_merge
create_account
inflation
payment
set_options
type

#
# Test assign bash variable in loop
#
while IFS= read x; do TryThis=`echo $x`; echo $TryThis; sleep 3s; done < tt
account_merge
create_account
inflation
payment
set_options
type
Count occurrences of each:
Code:
while IFS= read x; do   TryThis=`echo $x`;   echo $TryThis;   cat $KAG_extract | sed 's/"//g' | awk -F, -v t=$TryThis '($4 == t) {print;}' | wc -l;   sleep 3s; done < tt
account_merge
12803
create_account
13774
inflation
28
payment
3421
set_options
123
type
1
Same for KAU CSV extract file:
Code:
while IFS= read x
do
  TryThis=`echo $x`
  echo $TryThis
  cat $KAU_extract | sed 's/"//g' | awk -F, -v t=$TryThis '($4 == t) {print;}' | wc -l
  sleep 3s
done < tt
account_merge
36540
create_account
37412
inflation
27
payment
8336
set_options
123
type
1
wc -l $KAU_extract
82438 ../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09:37:45-UTC.csv
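The grep-per-type loop above works, but awk can tally every value of Field4 in a single pass over the file, loop-free. A sketch on a made-up file (the real $KAG_extract or $KAU_extract would be substituted):

```shell
# One pass: count occurrences of each Field4 value in an associative array
printf '%s\n' 'a,b,c,payment' 'a,b,c,payment' 'a,b,c,inflation' 'a,b,c,create_account' > demo.csv

awk -F, '{c[$4]++} END{for (t in c) print c[t], t}' demo.csv | sort
# 1 create_account
# 1 inflation
# 2 payment
```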
Conclusions?
No idea. Too soon. We're analyzing the data, doing reverse forensics from the data upwards to the blockchain (?). A form of reverse engineering.
Until next time.
 
TL;DR How to set up 2*4 linux bash environment variables so that we can process them with awk later.

Just discovered today that the 4 main accounts are listed for each of KAU and KAG here
By clicking on any word there with an underlying URL, you get
Kinesis well-known accounts
like so:

[Screenshot: the Kinesis well-known accounts table]
Let's process those on semi-automatic so that we have code to set up our environment variables:
Screen-scrape:
Code:
echo 'Root account       GDTYNME5HX......6SCKC7J3PD      GAUCIFE37F......RMWGF76LTI
Emission account        GDIENNQ3BX......AZIGX2JCCS      GCGTMT2X6N......QU3C6AIUGU
Hot Wallet account      GBUBOKEFUN......OOI5KSVCZS      GBTYCT2VVW......LE6YDX4A7E
Inflation account       GCO75U2EVO......ASSKT4MA3B      GBBVUAMR3C......ZNNBT6FT7I' > t

cat t
Root account    GDTYNME5HX......6SCKC7J3PD    GAUCIFE37F......RMWGF76LTI
Emission account    GDIENNQ3BX......AZIGX2JCCS    GCGTMT2X6N......QU3C6AIUGU
Hot Wallet account    GBUBOKEFUN......OOI5KSVCZS    GBTYCT2VVW......LE6YDX4A7E
Inflation account    GCO75U2EVO......ASSKT4MA3B    GBBVUAMR3C......ZNNBT6FT7I

nl t
     1    Root account    GDTYNME5HX......6SCKC7J3PD    GAUCIFE37F......RMWGF76LTI
     2    Emission account    GDIENNQ3BX......AZIGX2JCCS    GCGTMT2X6N......QU3C6AIUGU
     3    Hot Wallet account    GBUBOKEFUN......OOI5KSVCZS    GBTYCT2VVW......LE6YDX4A7E
     4    Inflation account    GCO75U2EVO......ASSKT4MA3B    GBBVUAMR3C......ZNNBT6FT7I
Code:
#
# Extract main account names
#
cat t | cut -f1 | sed 's/account/Account/g' | sed 's/ //g' > tt
nl tt
     1    RootAccount
     2    EmissionAccount
     3    HotWalletAccount
     4    InflationAccount

We could of course then do some tedious manual copy/paste. Let's do it in a more automated fashion, with some help from an Internet search, then Stack Exchange and Stack Overflow lower down:
Code:
https://unix.stackexchange.com/questions/645206/how-to-get-string-between-starting-with-x-to-ending-with-y-using-shell

linux> S1="10.10.10.10 (XXXX,ABC123.sub.domain.Xcom,XXX,XX - abc123 ) ANY"
linux> echo $S1
10.10.10.10 (XXXX,ABC123.sub.domain.Xcom,XXX,XX - abc123 ) ANY

linux> echo $S1 | grep -o "ABC.*com"
ABC123.sub.domain.Xcom

# Target syntax:
#
grep -o "GDTYNME5HX.*6SCKC7J3PD" $KAU_extract | awk -F, '{print $1}' | sed 's/"//g' | sort -u
GDTYNME5HX3FCFDS4D3R3LTVH3DFLSB5HWVTV3VVL4PBGQ6SCKC7J3PD
Code:
# The string columns can be processed nicely:
#
cat t | cut -f2 | cut -c1-10 > ttt1
cat t | cut -f2 | cut -c17- > ttt2
nl ttt1 ttt2
     1    GDTYNME5HX
     2    GDIENNQ3BX
     3    GBUBOKEFUN
     4    GCO75U2EVO
     5    6SCKC7J3PD
     6    AZIGX2JCCS
     7    OOI5KSVCZS
     8    ASSKT4MA3B

paste -d " " ttt1 ttt2 | sed 's/ /\.\*/g' > ttt3

linux> paste -d " " ttt1 ttt2 | sed 's/ /\.\*/g' > ttt3
linux> nl ttt3
     1    GDTYNME5HX.*6SCKC7J3PD
     2    GDIENNQ3BX.*AZIGX2JCCS
     3    GBUBOKEFUN.*OOI5KSVCZS
     4    GCO75U2EVO.*ASSKT4MA3B

# That was to set up the grep string
Code:
while IFS= read x
do
  grep -o "$x" $KAU_extract | awk -F, '{print $1}' | sed 's/"//g' | sort -u
  sleep 3s
done < ttt3 > ttt4

linux> nl ttt4
     1    GDTYNME5HX3FCFDS4D3R3LTVH3DFLSB5HWVTV3VVL4PBGQ6SCKC7J3PD
     2    GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
     3    GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
     4    GCO75U2EVO3HKGYO2PUO2ZO7CEL52JPQL5WYUHHKPEORQSASSKT4MA3B
To set up the environment variables using a script:
dot space dot [not obvious!]
Code:
https://stackoverflow.com/questions/1464253/global-environment-variables-in-a-shell-script

sed 's/^/KAU_/g' tt > tt-prime
paste -d= tt-prime ttt4 > KAU-set-main-account-symbols.sh
chmod +x KAU-set-main-account-symbols.sh
. ./KAU-set-main-account-symbols.sh

linux> nl KAU-set-main-account-symbols.sh
     1    KAU_RootAccount=GDTYNME5HX3FCFDS4D3R3LTVH3DFLSB5HWVTV3VVL4PBGQ6SCKC7J3PD
     2    KAU_EmissionAccount=GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
     3    KAU_HotWalletAccount=GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
     4    KAU_InflationAccount=GCO75U2EVO3HKGYO2PUO2ZO7CEL52JPQL5WYUHHKPEORQSASSKT4MA3B
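Why the "dot space dot"? Executing ./script.sh runs it in a child shell, so any variables it sets die with the child; sourcing it with ". ./script.sh" runs it in the current shell, so the variables stick. A minimal demonstration with a hypothetical demo.sh:

```shell
# A one-line script that just assigns a variable
printf '%s\n' '#!/bin/sh' 'DEMO_VAR=hello' > demo.sh
chmod +x demo.sh

./demo.sh                               # executed: assignment happens in a child shell
echo "after execute: '$DEMO_VAR'"       # empty

. ./demo.sh                             # sourced: assignment lands in the current shell
echo "after source: '$DEMO_VAR'"        # hello
```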
Code:
for i in `paste -d " " -s  tt`; do   echo "echo \$KAU_"$i; done > KAU-check.sh

linux> for i in `paste -d " " -s  tt`; do   echo "echo \$KAU_"$i; done > KAU-check.sh
linux> chmod +x KAU-check.sh
linux> . ./KAU-check.sh
GDTYNME5HX3FCFDS4D3R3LTVH3DFLSB5HWVTV3VVL4PBGQ6SCKC7J3PD
GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
GCO75U2EVO3HKGYO2PUO2ZO7CEL52JPQL5WYUHHKPEORQSASSKT4MA3B

# confirmed

# The shell variables for the 4 main accounts are now set

# better copy it now to avoid this rigmarole every time


We store scripts for future use:

. ./KAU-set-main-account-symbols.sh
. ./KAU-check.sh

Do the exact equivalent for KAG.
[Work not shown]

4 scripts attached below in a standard ZIP file, created using:

Code:
linux> zip -j SetUpEnvironment.zip \
KAU-set-main-account-symbols.sh \
KAU-check.sh \
KAG-set-main-account-symbols.sh \
KAG-check.sh

linux> zip -sf SetUpEnvironment.zip
Archive contains:
  KAU-set-main-account-symbols.sh
  KAU-check.sh
  KAG-set-main-account-symbols.sh
  KAG-check.sh
Total 4 entries (816 bytes)

Another way to check the symbols are set properly:
Code:
echo $KAU_{RootAccount,EmissionAccount,HotWalletAccount,InflationAccount} | xargs printf "%s\n%s\n"

GDTYNME5HX3FCFDS4D3R3LTVH3DFLSB5HWVTV3VVL4PBGQ6SCKC7J3PD
GDIENNQ3BXCTB74ZYCQAGXCY7KTGFBZGHRMUVF3ZLIW6SMAZIGX2JCCS
GBUBOKEFUNF4ZZQA7QJCGYF54U4N2T5VV2QAN7RWQCVS75OOI5KSVCZS
GCO75U2EVO3HKGYO2PUO2ZO7CEL52JPQL5WYUHHKPEORQSASSKT4MA3B

echo $KAG_{RootAccount,EmissionAccount,HotWalletAccount,InflationAccount} | xargs printf "%s\n%s\n"

GAUCIFE37F4KQ5F6QPNSZ75QKRQTNRCF32FZNUXMCXUFSKRMWGF76LTI
GCGTMT2X6NUV6ABEOAOSDI2YQ7FXQOQYKYA7KVZQ5ID67GQU3C6AIUGU
GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
GBBVUAMR3CYNQKMNHVWCMUQVE3XQIL3WM5GSP5D6SCECKIZNNBT6FT7I
Until next time...
 

Attachments

  • SetUpEnvironment.zip (1.2 KB)
When working with spreadsheets, it may be useful to flatten them out systematically to a comma separated CSV file (a flat file).
We can then work the CSV file with standard linux CLI tools (awk, head, cat, sed ...).

This is particularly relevant when reading the CSV files from Derek's Watcher node and uchiki's two spreadsheets above.
"From" and "To" fields are not always where you expect them.
The odd-man out is Derek's KAG CSV file (see below).

To preserve sanity when cycling between different CSV types, list the column headers in one table.
Here's how:

First install gnumeric (useful utilities; it provides ssconvert). It doesn't take a minute.
# under linux/ubuntu:
Code:
sudo apt-get install gnumeric
Convert the spreadsheets to flat files
Code:
ssconvert Template\ KAG\ blockchain\ analysis.xlsx uchiki-KAG.csv
ssconvert Template\ KAU\ blockchain\ analysis.xlsx uchiki-KAU.csv
PS1="uchiki> "
uchiki> ls -nr
Number the columns/fields (useful for awk field names: $1,...,$n)
List out the alphabet (to quickly read spreadsheet headers without going into LibreOffice)
Code:
for i in {A..Z}; do echo $i; done > alphabet
for i in AA AB AC
do
  echo $i
done
for i in AA AB AC; do   echo $i; done >> alphabet
nl alphabet > t5
Set bash symbols and CHECK
Code:
KAG_extract=../Blockchain-Exploration/KAG/stellar-export-20230223-1606.csv
KAU_extract=../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09\:37\:45-UTC.csv
KAG_uchiki=uchiki-KAG.csv
KAU_uchiki=uchiki-KAU.csv

uchiki> set -o posix ; set | grep "KA[G,U]_" | head -4
KAG_extract=../Blockchain-Exploration/KAG/stellar-export-20230223-1606.csv
KAG_uchiki=uchiki-KAG.csv
KAU_extract=../Blockchain-Exploration/KAU/stellar-export-KAU-2023-02-24-T09:37:45-UTC.csv
KAU_uchiki=uchiki-KAU.csv
The fun part:
Code:
head -1 $KAG_extract | sed 's/,/\n/g' | sed 's/\"//g' > t1
head -1 $KAG_uchiki | sed 's/,/\n/g'  | sed 's/\"//g' > t2
head -1 $KAU_extract | sed 's/,/\n/g' | sed 's/\"//g' > t3
head -1 $KAU_uchiki | sed 's/,/\n/g'  | sed 's/\"//g' > t4

uchiki> pr -m -t t5 t[1-4] | expand
     1  A             "id"       id            "id"               id
     2  B             "transactionS transactionSu "transactionS transactionSu
     3  C             "sourceAccoun sourceAccount "sourceAccoun sourceAccount
     4  D             "type"        type          "type"        type
     5  E             "typeI"       typeI         "typeI"       typeI
     6  F             "createdAt"   createdAt     "createdAt"   createdAt
     7  G             "transactionH transactionHa "transactionH transactionHa
     8  H             "assetType"   account       "account"     account
     9  I             "from"        into          "into"        into
    10  J             "to"          startingBalan "startingBala startingBalan
    11  K             "amount"      funder        "funder"      funder
    12  L             "account"     assetType     "assetType"   assetType
    13  M             "into"        from          "from"        from
    14  N             "startingBala to            "to"          to
    15  O             "funder"      amount        "amount"      amount
    16  P             "signerKey"   signerKey     "signerKey"   signerKey
    17  Q             "signerWeight signerWeight  "signerWeight signerWeight
    18  R             "masterKeyWei masterKeyWeig "masterKeyWei masterKeyWeig
    19  S             "lowThreshold lowThreshold  "lowThreshold lowThreshold
    20  T             "medThreshold medThreshold  "medThreshold medThreshold
    21  U             "highThreshol highThreshold "highThreshol highThreshold
    22  V                                                       
    23  W                           Minted1                     Minted1
    24  X                           Minted2                     "Minted 2"
    25  Y                           Redeemed                    Redeemed
    26  Z                                                       
    27  AA                          "Cumulative M               "Cumulative M
    28  AB                          "Cumulative R               "Cumulative R
    29  AC                          "Net Circulat               "Net Circulat
 
A sample record of Type 1 (payment) from uchiki's XLSX sheet converted to a flat CSV file as explained higher up:
Code:
     1    A id 1.02883557868835E+17
     2    B transactionSuccessful true
     3    C sourceAccount GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
     4    D type payment
     5    E typeI 1
     6    F createdAt 2023-01-11T19:14:10Z
     7    G transactionHash cdc1e59d1f682ebe9c26eaf107d9894c1b69d5b5bfe7f7c087cdf6a5ccfcebc9
     8    H account 
     9    I into 
    10    J startingBalance 
    11    K funder 
    12    L assetType native
    13    M from GBTYCT2VVWURNU23ZSR3IPSXU6BRWT3ELIOQJAJOKGIHCLLE6YDX4A7E
    14    N to GDEPDZCGVQ7U3337CM6QQU5GDDXMEXVM77WNLW3RFNP6JS45HVYW53AL
    15    O amount 97.72679
    16    P signerKey 
    17    Q signerWeight 
    18    R masterKeyWeight 
    19    S lowThreshold 
    20    T medThreshold 
    21    U highThreshold 
    22    V  
    23    W Minted1 0
    24    X Minted2 0
    25    Y Redeemed 0
    26    Z  
    27    AA Cumulative Minted 95316.098
    28    AB Cumulative Redeemed 138600
    29    AC Net Circulation -43283.902
My aim is to minimize thinking about format.
To help us all focus on the data.
I am mostly using a similar bag of tools over and over again.
Keeping it playful.
Code:
for i in {A..Z}; do echo $i; done > alphabet
for i in AA AB AC; do   echo $i; done >> alphabet 
nl alphabet > t1

head -1 ../uchiki\ libreCalc\ sheets/uchiki-KAG.csv | sed 's/\"//g' | sed 's/,/\n/g' > t2
sed '1d' ../uchiki\ libreCalc\ sheets/uchiki-KAG.csv | sed 's/\"//g' | awk -F, '($5==1) {print $0}' | head -1 | sed 's/,/\n/g' > t3

paste -d " " t[1-3]
 
