Compare commits

...

191 Commits

Author SHA1 Message Date
Vivek Teega
836de91fb3 Bug fixes + cleanup 2024-08-16 06:28:54 +00:00
Sai Raj
4293c79e63 bug fixes 2024-08-12 21:31:41 -04:00
Sai Raj
a71c15fe6c Adding flags to indicate status of backend
- API responses now have a warning message if backend is still syncing
- API v2 responses now have HTTP codes 200, 206, 400, 404, 500, 503
200: response is ok
206: response is ok, but backend is still syncing, thus data might not be final
404: data not found error
503: data not found, but backend is still syncing
400: client request error
500: internal server error
2024-08-12 21:09:08 -04:00
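The status-code scheme above maps cleanly onto a small helper; a minimal sketch (the function name and flags are illustrative, not from the repo):

```python
def pick_status(found: bool, syncing: bool, client_error: bool = False,
                server_error: bool = False) -> int:
    """Map backend state to the HTTP codes used by the API v2 responses."""
    if client_error:
        return 400                        # malformed client request
    if server_error:
        return 500                        # internal server error
    if found:
        return 206 if syncing else 200    # 206: data might not be final yet
    return 503 if syncing else 404        # 503: missing data may still arrive
```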
Sai Raj
63e5a0da87 Multi-threaded fetches for scanning blocks
- speedup scanBlocks using batch fetch threading from API
- fixes for check_for_reorg and process restart when reorg happens
2024-08-09 01:02:18 -04:00
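A batch fetch of this kind can be sketched with a thread pool; `fetch_block` is a hypothetical callable standing in for the blockbook API request:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_blocks_batch(block_heights, fetch_block, max_workers=8):
    """Fetch many blocks concurrently while preserving block order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map returns results in input order, so blocks can still be
        # processed sequentially after the concurrent fetch completes
        return list(pool.map(fetch_block, block_heights))
```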
Sai Raj
1b9f75ebb4 thread bug fix for sqlite 2024-08-03 17:37:15 -04:00
Sai Raj
40f43aefb2 bug fix 2024-08-03 16:06:30 -04:00
Sai Raj
fff6824b30 Threading and bug fixes 2024-08-03 15:59:03 -04:00
Sai Raj
c8881c23de Uploading the main file for combine system
main file to process the config file, start the backend process, and then the API server
2024-07-18 05:59:44 -04:00
Sai Raj
ac5913d3a1 rename tokenscanner py file to backend_main.py 2024-07-18 05:35:26 -04:00
Sai Raj
0c5365b510 Update flosight to blockbook 2024-07-18 05:32:38 -04:00
Sai Raj
a3859016a2 merge config files for api and scanner 2024-07-18 05:32:10 -04:00
Sai Raj
9d2b63ce77 rename ranchimallflo_api.py to api_main.py 2024-07-18 05:29:40 -04:00
Sai Raj
31ebfb9278 Update for merging api and scanner 2024-07-18 05:28:55 -04:00
Sai Raj
cc7b51d37a Update requirements.txt 2024-07-15 02:30:30 -04:00
Sai Raj
46f2b5696b Update for combining with api 2024-07-15 02:23:21 -04:00
Sai Raj
134553c104 merge bug fix 2024-07-15 01:22:33 -04:00
Sai Raj
772cd369d6 Merge branch 'blockbook-dev' of https://github.com/ranchimall/flo-token-tracking into blockbook-dev 2024-07-15 01:18:41 -04:00
Sai Raj
1e2e688453 Adding API server files 2024-07-15 01:18:10 -04:00
Sai Raj
937a12cc50 move source files to src/backend 2024-07-15 01:17:45 -04:00
f369ef7e87 Small bug fixes 2024-07-14 06:58:36 +00:00
Sai Raj
5449ee83a5 Exception handle
- Adding exception handle for all fetch and db connections within a loop with retry timeout
- blocked all sys.exit calls to prevent the system from shutting down; instead, retry after 30 mins or 1 hr
2024-07-10 04:15:03 -04:00
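The retry-instead-of-exit pattern described above can be sketched as follows (function names are illustrative, not from the repo):

```python
import time

def with_retry(fn, retries=3, delay=1.0):
    """Call fn(), retrying on failure instead of exiting the process."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise            # surface the error after the final attempt
            time.sleep(delay)    # e.g. 30 min / 1 hr in production
```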
Sai Raj
8464d48b0a Adding utility fns for reorg
Adding functions for auto-rollback on blockchain reorg
2024-07-07 01:21:55 -04:00
Sai Raj
8b873f364e error fixes 2024-07-07 01:19:53 -04:00
Sai Raj
37cee58465 Auto initialize data storage
no longer requires --reset on the first run
--reset can still be used to force delete all data and restart the scanner
2024-07-07 01:06:49 -04:00
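A minimal sketch of such first-run initialization (the function name and layout are hypothetical):

```python
import os
import shutil

def init_storage(data_path, reset=False):
    """Create the data directory on first run; wipe and recreate it on --reset."""
    if reset and os.path.isdir(data_path):
        shutil.rmtree(data_path)           # --reset force-deletes all data
    os.makedirs(data_path, exist_ok=True)  # first run no longer needs --reset
```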
38d90659be Critical bug fix: Validating if an address is associated with any Smart Contracts previously 2024-03-18 09:01:04 +00:00
0dd32c7803 Merge branch 'upgrade/blockbook' of https://github.com/ranchimall/flo-token-tracking into upgrade/blockbook 2024-02-16 10:11:37 +00:00
4fa48c808c Added code to handle misbehaving blockbook API 2024-02-16 10:11:22 +00:00
e3b197cd4b Fix conditional check 2024-02-16 09:58:09 +00:00
4234abe59a Cleaning pdb statements 2023-11-22 16:48:10 +00:00
3d2654149b Decimal fix for SQL queries( plain SQL + SQL Alchemy ) 2023-11-20 19:37:27 +00:00
Vivek Teega
19198ddc82
Merge pull request #44 from ranchimall/upgrade/blockbook-decimalmod
Converting calculations to Decimal calculations
2023-11-20 11:00:15 +05:30
c6cd23fc9d Making sure Oracle address cannot be contract address 2023-11-18 16:16:01 +00:00
e9bc8546c2 Converting calculations to Decimal calculations 2023-11-11 05:00:22 +00:00
a83ed33a99 Further changes for blockbook shifting:
1. Removed socketio
2. Added support for blockbook's websocket
3. Fixed a bug regarding transaction processing. Blockbook doesn't return the floData key in a transaction's details API response if floData is not present in the transaction. Updated code to handle this
2023-10-27 11:13:09 +00:00
bb77a9723b Added more folders to gitignore 2023-10-27 11:07:56 +00:00
08168cac5e Bug fixes with tokenSum calculation right before a contract is triggered 2023-10-09 07:52:48 +00:00
78b05be204 Cleanup: to make the scan faster 2023-10-05 20:40:10 +00:00
dfc2a85da4 Added a check to make sure payeeAddress is not contractAddress 2023-10-05 20:36:46 +00:00
4e22b846fb Fix internal transactions + Added another column referenceTxHash to contractwinners table 2023-09-29 12:05:04 +00:00
b7c9496eae Added new helper scripts 2023-09-11 11:33:37 +00:00
cfaf2e2ed4 Fixed bug which was ignoring internal transactions 2023-09-05 17:18:36 +00:00
396e916a5b Fix in logic of rebuild_withAPI 2023-08-26 21:23:21 +00:00
6a07e2bbc1 Added a util rebuild script which takes fresh data from the API too 2023-08-26 20:51:50 +00:00
6f6f3ce420 Added rebuild code 2023-08-22 10:49:34 +00:00
e011f7208a Remove typo 2023-08-22 10:41:19 +00:00
9c7387795f Pushing a base layer to solve new branch merge issues during migration 2023-08-16 03:50:30 +00:00
44aa304f61 Fixed rollback 2023-08-12 17:51:13 +00:00
63a3c2344c Changes required for blockbook migration 2023-08-09 14:48:56 +00:00
Vivek Teega
f62b3d196b
Merge pull request #39 from ranchimall/bugfix/swap-statef-testing
Bugfix/swap statef testing
2023-06-11 23:10:57 +05:30
f20b8d9e8f Added tokenAddressMapping in the missing places 2023-06-11 17:39:43 +00:00
fbc534b477 Bugfix : reference before assignment 2023-06-09 20:19:26 +00:00
b396bc1f74 Critical bug fix for infinite token addressBalance 2023-06-09 19:55:49 +00:00
Vivek Teega
9eda14ae6a
Update test_parsing.yml 2023-05-18 14:58:13 +05:30
Vivek Teega
e9ad3307f6 Refactoring + cleanup 2023-05-18 09:26:13 +00:00
Vivek Teega
a8e885f729 Updated functions to find committee list and fetch latest oracle price in token swap to accommodate changes in flosight API 2023-05-14 12:54:48 +00:00
Vivek Teega
4a254336b8 Added subtype and unix_expiryTime
* Added subtype and unix_expiryTime to the contractStructure
* Fixed the automated tests for parsing with subtype and unix_expiryTime information
2023-05-10 11:26:37 +00:00
Vivek Teega
5a7ce5bd99 Cleanup: Removing pdb traces 2023-05-07 22:18:47 +00:00
Vivek Teega
993bf6e1b8 Storing unix expiryTime as part of contract structure 2023-05-07 22:12:48 +00:00
Vivek Teega
fa9798d1f0 Removing an extra debugger line which was left 2023-05-06 13:36:55 +00:00
Vivek Teega
6d56c2a1e0 Fixed bug with token swap dynamic price calculation & added a regex version of parsing date time 2023-05-04 23:28:10 +00:00
Vivek Teega
dce543284c Added check to stop users from making contracts on an address which has/had tokens before 2023-05-01 17:10:09 +00:00
Vivek Teega
355364badb Fixed bugs with external trigger contract's data stored in time_actions table and activecontracts table 2023-04-30 17:48:32 +00:00
Vivek Teega
e94506bf14 Fixed issues with time related contract triggers and dummy data showing up in activecontracts and time_actions table + cleanup 2023-04-30 11:05:16 +00:00
Vivek Teega
796e84cc05 Merge branch 'swap-statef-testing' of https://github.com/ranchimall/flo-token-tracking into swap-statef-testing 2023-04-29 22:14:53 +00:00
Vivek Teega
b84456602d More automated tests 2023-04-29 22:13:37 +00:00
Vivek Teega
bf4684cbe7 Merge branch 'swap-statef-testing' of https://github.com/ranchimall/flo-token-tracking into swap-statef-testing 2023-04-29 22:12:09 +00:00
Vivek Teega
0966070239 Added more automated tests 2023-04-29 22:11:40 +00:00
Vivek Teega
cd1b36a246 Critical bug fix for fetching time action related active smart contracts 2023-04-29 22:10:55 +00:00
15bc31c4a7 Fix for multisig support 2023-04-26 15:58:23 +00:00
Vivek Teega
e0d013cd63 Added automated test for contract trigger 2023-04-24 18:27:00 +00:00
Vivek Teega
a35b35d22e Making sure the sender address is oracle_address in fetchDynamicSwapPrice() 2023-04-24 18:26:36 +00:00
Vivek Teega
ba80ae4e6a Bug fix swap participation + added automated test for deposit parsing
* Added a parsing test for Deposit
* Bug Fix with Swap's participant address
2023-04-17 10:13:42 +00:00
Vivek Teega
d794e65667 Fix further bugs in token swap deposit 2023-04-14 01:15:00 +00:00
Vivek Teega
752f999ec8 Fixed deposit return for token swap 2023-04-13 15:01:24 +00:00
Vivek Teega
2132dd94fe Fixed bug with deposit return 2023-04-11 12:29:51 +00:00
42dc2d6e98 Added test for infinite token parsing 2023-04-08 20:36:59 +00:00
989251127c Fixed confusion between time vs blocktime
### Renamed all database columns related to the time of a transaction's confirmation to 'time'
There were instances of blocktime being used instead of time in the older code, but the Flosight API uses the time variable consistently across all its results. To maintain consistency, all database columns related to time/blocktime will now be called "time"

This change has affected models.py and tracktokens_smartcontracts.py file
2023-04-07 08:00:55 +00:00
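A rename like the one described above can be done as a one-off SQLite migration; a sketch under stated assumptions (the table and function names are illustrative; `ALTER TABLE ... RENAME COLUMN` requires SQLite 3.25+):

```python
import sqlite3

def rename_blocktime_column(db_path, table):
    """Rename a table's 'blocktime' column to 'time' if not already done."""
    con = sqlite3.connect(db_path)
    try:
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt, pk)
        cols = [row[1] for row in con.execute(f"PRAGMA table_info({table})")]
        if "blocktime" in cols and "time" not in cols:
            con.execute(f"ALTER TABLE {table} RENAME COLUMN blocktime TO time")
            con.commit()
    finally:
        con.close()
```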
Vivek Teega
073927bccd Preventing oracle address from participating in the contract 2023-03-28 09:00:01 +00:00
Vivek Teega
20c0a4bf76 Fix minor bug in dynamic swap price calculation 2023-03-28 08:44:21 +00:00
Vivek Teega
497e7937d1 Fixed bugs in token swap contract's deposit return 2023-03-27 16:55:25 +00:00
Vivek Teega
b82e35153f Fixed bugs in dynamic price processing 2023-03-27 13:57:40 +00:00
Vivek Teega
7abafd2c2c Fixed bug : parsing and storing of oracle address 2023-03-27 13:53:20 +00:00
Vivek Teega
7cf7883b59 Updated regex and tests for tokenamount parsing 2023-03-22 11:19:57 +00:00
Vivek Teega
c4be19058f Updated regex and tests for tokenamount parsing 2023-03-22 11:19:32 +00:00
Vivek Teega
ea3985cb28 Fixing workflow file : automated test for parsing.py 2023-03-19 14:28:05 +00:00
Vivek Teega
342ded906e Setting up automated testing for parsing.py 2023-03-19 13:49:37 +00:00
Vivek Teega
41753a03c3 Fix for infiniteToken parsing bug which was introduced with -ve and 0 number checks for token system 2023-03-09 16:31:37 +00:00
Vivek Teega
02de71e618 Cleanup and updating of flosight link 2023-03-01 11:56:36 +00:00
Vivek Teega
a173cf6ac3 Changes: NFT, Token Swap, Rollback script
* Fixed bugs in NFT creation and transfer
* Added functionality to Token swap contract: It can have dynamic pricing which can be picked up from a blockchain address
* Rollback script was ignoring nft tokens, now its included
2023-02-27 12:32:08 +00:00
Vivek Teega
acffba0698 Added functionality to add/remove a committee member from the list based on the time
It's possible that a committee member provided the right triggers for x number of contracts before going rogue. Removal of the committee member after going rogue should not affect the previous contracts' results.

* Transaction comparison with blocktime has been implemented when creating the committee list
* The order of execution of addition and removal is now opinionated to maintain consistency: addition will be first, removal will be later
2023-02-16 15:26:06 +00:00
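The ordering rule above can be sketched as a small pure function (the event representation is hypothetical, not the repo's data model):

```python
def committee_at(events, at_time):
    """Rebuild the committee list as of `at_time` from timestamped events.

    Events are (time, action, address) tuples; additions are applied before
    removals at the same timestamp, matching the opinionated ordering.
    """
    order = {"add": 0, "remove": 1}  # additions sort before removals
    members = set()
    for t, action, addr in sorted(events, key=lambda e: (e[0], order[e[1]])):
        if t > at_time:
            break                     # later events don't affect past results
        if action == "add":
            members.add(addr)
        else:
            members.discard(addr)
    return members
```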
Vivek Teega
e599caa9d4 Contract committee list calculation should only consider transactions whose input address is the admin FLO address 2023-02-16 13:40:12 +00:00
Vivek Teega
528223fec7 External trigger Committee addresses shifted to blockchain
* The committee addresses who are responsible for one-time-trigger Smart Contract's triggers have been shifted to the blockchain.
* Added an APP ADMIN ID field who will be responsible for appointing committee members
2023-02-16 08:59:19 +00:00
Vivek Teega
901026ccdd Added checks to categorize negative contract and token amounts as noise 2023-02-12 11:49:36 +00:00
Vivek Teega
2da2c555f4 Workaround for the case when contract creation is rejected but Sqlite3 database exists 2023-02-12 07:37:09 +00:00
Vivek Teega
a26223f8e1 Accepting both camel case and snake case for the accepting token & selling token in Token Swap contract 2023-02-08 13:01:22 +00:00
Vivek Teega
30b3d20630 Storing senderAddress and receiverAddress of a transaction as part of transactionDetails 2023-02-06 13:47:06 +00:00
Vivek Teega
138e53bb06 Added blocktime column in transactionHistory and refactored those database operations into a function 2023-01-29 20:41:26 +00:00
Vivek Teega
22bca78569 Fix for internal triggers not having txid 2023-01-24 16:37:58 +00:00
Vivek Teega
743df4d18f Added functions to reduce repeatability 2023-01-23 09:48:54 +00:00
Vivek Teega
4511158a43 Refactored closing/expiring a contract into a function 2022-12-06 10:49:23 +00:00
Vivek Teega
26bb8004ee Refactored RejectedContractTransactionHistory entries into a function 2022-12-04 11:06:55 +00:00
Vivek Teega
0fcfeae966 Handle 0 participation contract transfers 2022-12-03 20:21:26 +00:00
Vivek Teega
6d6161ed83 Fixing bugs with logic of time based triggers in time_actions table 2022-12-01 11:17:34 +00:00
Vivek Teega
b30e8fd875 Storing contract history in multiline to accommodate rollback 2022-11-27 06:07:53 +00:00
Vivek Teega
8b112bf0b3 Update contract status in time_actions table after external committee trigger 2022-11-23 20:08:17 +00:00
Vivek Teega
9dd69065c8 Merged local trigger contracts 2022-11-23 16:11:16 +00:00
Vivek Teega
86625c7a80 Cleanup 2022-11-17 12:00:50 +00:00
Vivek Teega
87fe48358e 1.1.2 Added new table ContractWinners for SmartContracts to accommodate rollback 2022-11-17 11:46:28 +00:00
Vivek Teega
18384e8eef 1.1.1 Addition of multiple payouts in internal trigger contract 2022-11-13 12:04:08 +00:00
Vivek Teega
b327ca5d58 Changes for tokenswap & stateF addition to contracts 2022-10-29 10:51:57 +00:00
Vivek Teega
a77fed3c1b
Update requirements.txt 2022-07-23 02:35:15 +05:30
Vivek Teega
430141cb77
Update requirements.txt 2022-07-22 19:23:20 +05:30
Vivek Teega
170a48f40d
Update parsing.py 2022-07-17 17:33:41 +05:30
Vivek Teega
90439ea2f0 pybtc -> pyflo 2022-07-17 11:48:12 +00:00
Vivek Teega
703013265a Converted pybtc to pyflo 2022-07-17 11:26:39 +00:00
Vivek Teega
f8c22cd571 Fixed bugs with data directory 2022-07-17 11:00:51 +00:00
Vivek Teega
cd9fb1131f 1.1.0 Separated data folders from the script 2022-07-17 08:22:39 +00:00
Vivek Teega
cf451c0257 Minor fix: removed pdb.set_trace() 2022-05-25 12:06:03 +00:00
Vivek Teega
1fc35f289f Fixed : Infinite token rejection bug 2022-05-25 12:03:46 +00:00
Vivek Teega
15dba443b2 1.0.14 Important bug fixes
- One time event userchoice contract was not being recognised by the scanner | fixed
- contractAmount specified as part of contract structure was not being recognised | fixed
- Added check in rollback script to make sure the rollback block is smaller than current block
2022-04-03 08:39:22 +00:00
Vivek Teega
d659efb298 Bug fixes in rollback and Infinite token creation 2022-04-01 20:13:21 +00:00
Vivek Teega
acc3858b27 1.0.13 Removal of bug in special character word parsing 2022-03-24 07:37:12 +00:00
Vivek Teega
e2b6ef1123 Adding db_reference in latestTransactions table 2022-03-08 09:08:44 +00:00
Vivek Teega
6a7fd83ffd Added more transaction types for latestCache.db 2022-02-25 10:03:35 +00:00
Vivek Teega
5c3a1f4536 Changes to updateLatestTransaction so it can take unique transactionType as argument 2022-02-24 13:07:20 +00:00
Vivek Teega
798b7ad21c 1.0.12 Completed latest version of token swap 2022-02-20 10:54:43 +00:00
Vivek Teega
e93e0ae10d Helper scripts : Addition of convert_db for bootstrap and changes in test_rebuild 2022-02-16 12:25:50 +00:00
Vivek Teega
d63495a878 Update: Time triggers, TimeActions
1. Consolidated all time triggers into one function - new logic in main script, TimeAction table added to system.db
2. Changes in code to convert RAW SQL to SQL Alchemy queries
2022-02-16 12:24:54 +00:00
Vivek Teega
cf70cfd066 Changes in parsing to add tokenswap price type predetermined|determined & continuos|continuous event 2022-02-02 12:38:49 +00:00
Vivek Teega
14c2f048fd 1.0.11 Newer block based logic for scanning and deleting 2022-02-02 11:10:47 +05:30
Vivek Teega
ac89bc1af8 1.0.10 Managing orphaned pid in ConsumedTable & deleting TransferLogs 2022-02-01 12:10:59 +05:30
Vivek Teega
e1143269ea 1.0.9 ConsumedTable doesn't get any changes after transfer 2022-01-31 21:24:57 +05:30
Vivek Teega
1a06347c08 Update test_rebuild latestCache 2022-01-31 14:18:57 +05:30
Vivek Teega
18e8cfaaf4 Update rollback changes 2022-01-31 13:06:30 +05:30
Vivek Teega
1e387114a4 Updating changes 2022-01-30 18:53:31 +05:30
Vivek Teega
f3918207be Update changes 2022-01-28 14:17:17 +05:30
Vivek Teega
43f3a91107 1.0.8 New rollforward and rollback scripts + storing blockNumber data in activeTable now 2022-01-27 21:05:39 +05:30
Vivek Teega
53782d9473 Updating progress 2022-01-23 21:56:38 +05:30
Vivek Teega
db9887679c Fixed typo for __name__ == "__main__" 2022-01-20 06:59:54 +00:00
Vivek Teega
c3c423429b 1.0.7 Added new column to token database to calculate address balances quickly 2022-01-17 20:56:48 +05:30
Vivek Teega
eaad8d88b8 Update changes 2022-01-17 16:13:49 +05:30
Vivek Teega
43ca91258b 1.0.6 Bootstrap-rebuild from latestCache db 2022-01-14 18:06:48 +05:30
Vivek Teega
6ce6f75a0e Merge branch 'token-swap' of https://github.com/ranchimall/flo-token-tracking into token-swap 2022-01-14 18:04:19 +05:30
Vivek Teega
29c6019c15 Test rebuild along with database changes 2022-01-14 17:36:59 +05:30
Vivek Teega
4920bc5486 Uploading latest changes 2022-01-14 12:04:49 +00:00
Vivek Teega
32c7494504 Removing further test code 2022-01-13 16:45:10 +05:30
Vivek Teega
f9e4b5115b Commenting test code in parsing.py 2022-01-13 16:43:18 +05:30
Vivek Teega
1307b0605b Added a missing bracket 2022-01-13 16:41:18 +05:30
Vivek Teega
714afa4ccd Added logic for infinite token 2022-01-13 16:36:09 +05:30
Vivek Teega
c111b73c82 Added parsing for NFTs and Infinite tokens 2022-01-12 20:27:12 +05:30
Vivek Teega
b19ffdfecc Refactored SQLAlchemy raw sql into 1 function 2022-01-11 20:45:57 +05:30
Vivek Teega
fc2e8378a5 Refactored SQLAlchemy ORM database commands to 1 function 2022-01-11 20:05:53 +05:30
Vivek Teega
2ba852f2a1 Smart Contract db check refactoring 2022-01-11 17:27:26 +05:30
Vivek Teega
23db3656aa Further refactoring of code functions 2022-01-11 16:42:34 +05:30
Vivek Teega
a61d21817d 1.0.5 Renamed the older parser and added the new parser 2022-01-11 13:04:52 +05:30
Vivek Teega
5abd4262e1 Refactoring processBlock & adding smart contract plan 2022-01-10 20:54:02 +05:30
Vivek Teega
1873330b6d before new changes 2022-01-10 20:42:34 +05:30
Vivek Teega
33969458ef 1.0.4 Finished basics of handling all outputreturn functions 2022-01-10 17:45:45 +05:30
Vivek Teega
1321385999 Update checkpoint 2022-01-10 13:47:59 +05:30
Vivek Teega
ebbd381177 Saving changes 2022-01-10 12:02:11 +05:30
Vivek Teega
41ef416e26 Updating progress 2022-01-08 20:50:17 +05:30
Vivek Teega
8fe97766f0 Saving changes 2022-01-08 20:13:41 +05:30
Vivek Teega
0b34ac38dc Updating progress 2022-01-08 19:31:54 +05:30
Vivek Teega
07580c6502 Update progress 2022-01-08 17:30:39 +05:30
Vivek Teega
0ef022dbd4 Conflict resolution for one-time-event incorporation 2022-01-07 17:43:33 +05:30
Vivek Teega
465f9b4222 Changes in outputreturn and input classifier 2022-01-07 17:02:31 +05:30
Vivek Teega
716ba95b5d Improvement on resolving token system conflict 2022-01-06 18:48:46 +05:30
Vivek Teega
8f49bfd610 1.0.3 Token creation and transfer classification 2022-01-06 18:21:25 +05:30
Vivek Teega
99f089152c Merge branch 'token-swap' of https://github.com/ranchimall/flo-token-tracking into token-swap 2022-01-06 08:47:56 +00:00
Vivek Teega
17a9dc6984 Update latest code 2022-01-06 08:35:44 +00:00
93f6c9540b
Update planning.py 2022-01-06 12:48:48 +05:30
3d64e141da
Update planning.py 2022-01-06 12:48:01 +05:30
Vivek Teega
b71b8d41ce Merge branch 'token-swap' of https://github.com/ranchimall/flo-token-tracking into token-swap 2022-01-06 06:55:27 +00:00
Vivek Teega
7318ed3e31 Added input classifier 2022-01-06 06:49:14 +00:00
6715311bf1
Update parser_function_definitions.py 2022-01-06 12:06:07 +05:30
Vivek Teega
e3094ad67a Plan update 2022-01-05 15:59:18 +00:00
Vivek Teega
7073405c85 conflict matrix added 2022-01-05 15:50:59 +00:00
Vivek Teega
05e840e88f Test updates 2022-01-05 15:16:34 +00:00
Vivek Teega
baaafd460b Code update 2022-01-05 14:42:47 +00:00
Vivek Teega
6ed3eaed09 Added first categorization function 2022-01-05 09:41:34 +00:00
Vivek Teega
8fcd54beb3 Saving progress 2022-01-05 09:28:31 +00:00
Vivek Teega
f817021f51 backup for files with details of the code refactoring 2022-01-04 10:47:09 +00:00
Vivek Teega
a2ad56b625 1.0.3 newMultiRequest function 2022-01-03 11:01:32 +00:00
Vivek Teega
6cae4a785e Cleaned multiple mentions of the API calls 2022-01-03 09:53:26 +00:00
Vivek Teega
47391a0641 Code cleanup - API Urls are being consolidated 2022-01-03 08:32:51 +00:00
7d162cd4ed
Update tracktokens-smartcontracts.py 2022-01-03 12:58:39 +05:30
Vivek Teega
0de0fdfd9a Removed localapi sources 2022-01-03 07:13:46 +00:00
Vivek Teega
41c4078db9 remove config.ini 2021-12-15 11:19:18 +00:00
Vivek Teega
a6ad599c8f Checkin update 2021-12-15 11:18:03 +00:00
Vivek Teega
98d0dbc81d 1.0.2 Addition of Token Swap participation code 2021-11-19 17:07:21 +05:30
Vivek Teega
1b27d0b31a 1.0.1 Token Swap Contract 2021-11-19 15:26:51 +05:30
35 changed files with 12206 additions and 3774 deletions

.github/workflows/test_parsing.yml vendored Normal file

@ -0,0 +1,31 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python
name: Test flodata parsing
on:
  push:
    branches: [ "swap-statef-testing" ]
  pull_request:
    branches: [ "swap-statef-testing" ]
permissions:
  contents: read
jobs:
  build:
    runs-on: self-hosted
    steps:
    - uses: actions/checkout@v3
    - name: Set up Python 3.8
      uses: actions/setup-python@v3
      with:
        python-version: "3.8"
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install arrow==1.1.0 pyflo-lib==2.0.9 requests==2.25.0
    - name: Test with unittest
      run: |
        python -m unittest tests/test_parsing.py

.gitignore vendored

@ -10,6 +10,11 @@ config.ini
config.py
*.log
py3/
py3.9.0/
py3.9/
__pycache__/
*.pyc
.vscode/
error-notes.txt
snippets*
helper-files/


@ -1 +0,0 @@
3.9.0


@ -1,57 +1,3 @@
# FLO Token & Smart Contract System
## Important versions and their hashes
The python script scans the FLO Blockchain for Token and Smart Contract activity and creates/updates local SQLite databases accordingly.
`339dac6a50bcd973dda4caf43998fc61dd79ea68`
The legacy token and smart contract system running currently on the server
`41c4078db98e878ecef3452007893136c531ba05` ==> WORKING VERSION | Token swap branch
The latest version with token swap smart contract and token transfer with the following problems:
1. Parsing module is not able to detect token creation and transfer floData
2. The smart contract system is not moving forward because it is not able to detect token databases as they are created when run from scratch; however, it is working with previously created token databases
`89d96501b9fcdd3c91c8900e1fb3dd5a8d8684c1`
Docker-compatibility branch is needed right now because Docker image made for flo-token-tracking required some changes which have been made in that branch.
## How to start the system
1. Create a virtual environment with python3.7 and activate it
```
python3.7 -m venv py3.7
source py3.7/bin/activate
```
2. Install python packages required for the virtual environment from `pip3 install -r requirements.txt`
3. Setup config files with the following information
For testnet
```
# config.ini
[DEFAULT]
NET = testnet
FLO_CLI_PATH = /usr/local/bin/flo-cli
START_BLOCK = 740400
# config.py
committeeAddressList = ['oVwmQnQGtXjRpP7dxJeiRGd5azCrJiB6Ka']
sseAPI_url = 'https://ranchimallflo-testnet.duckdns.org/'
```
For mainnet
```
# config.ini
[DEFAULT]
NET = mainnet
FLO_CLI_PATH = /usr/local/bin/flo-cli
START_BLOCK = 3387900
# config.py
committeeAddressList = ['FRwwCqbP7DN4z5guffzzhCSgpD8Q33hUG8']
sseAPI_url = 'https://ranchimallflo.duckdns.org/'
```
4. If running for the first time, run `python3.7 tracktokens-smartcontracts.py --reset` otherwise run `python3.7 tracktokens-smartcontracts.py`
If you want to listen to RanchiMall's Token Tracker scanner's events you have to subscribe to Ranchimallflo API's end point `/sse`
Reference - https://ably.com/topic/server-sent-events

app.py

@ -1,23 +0,0 @@
import os
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'

@app.route('/getmarkerlist')
def marker_list():
    dblist = os.listdir("databases/")
    dbdict = {}
    for idx, item in enumerate(dblist):
        dbdict[idx] = item[:-3]
    return jsonify(dbdict)

app.run(debug=True)


@ -1 +1,30 @@
committeeAddressList = [<committeeAddress>]
# general configs
DATA_PATH = "/home/production/Dev/flo-token-tracker"
APP_ADMIN = "oWooGLbBELNnwq8Z5YmjoVjw8GhBGH3qSP"
# API configs
apiUrl = 'https://blockbook.ranchimall.net/api'
FLO_DATA_DIR = "/home/production/.flo"
HOST = "localhost"
PORT = 8080
debug_status = False
sse_pubKey = '<public key in the format of pybtc python library>'
# your apilayer.net access key
apilayerAccesskey = '<accesskey>'
NET = "mainnet"
START_BLOCK = 3387900
BLOCKBOOK_NETURL = "https://blockbook.ranchimall.net/"
TOKENAPI_SSE_URL = "https://ranchimallflo.duckdns.org/"
MAINNET_BLOCKBOOK_SERVER_LIST = ["https://blockbook.ranchimall.net/"]
TESTNET_BLOCKBOOK_SERVER_LIST = ["https://blockbook-testnet.ranchimall.net/"]
IGNORE_BLOCK_LIST = [902446]
#IGNORE_TRANSACTION_LIST = "b4ac4ddb51188b28b39bcb3aa31357d5bfe562c21e8aaf8dde0ec560fc893174"
"""?NOT USED?
FLO_CLI_PATH = "/usr/local/bin/flo-cli"
"""


@ -1,4 +0,0 @@
[DEFAULT]
NET = mainnet
FLO_CLI_PATH = /usr/local/bin/flo-cli
START_BLOCK = 3387900


@ -1,3 +0,0 @@
cd /home/production/Desktop/flo-token-tracking/
python3 tracktokens-smartcontracts.py

main.py Normal file

@ -0,0 +1,39 @@
import sys
import time
import threading

from src.api.api_main import start_api_server
from src.backend.backend_main import start_backend_process
import config as config
from src.flags import set_run_start

DELAY_API_SERVER_START = 60  # 1 min

def convert_to_dict(module):
    context = {}
    for setting in dir(module):
        if not setting.startswith("__"):
            context[setting] = getattr(module, setting)
    return context

if __name__ == "__main__":
    # parse the config file into dict
    _config = convert_to_dict(config)
    set_run_start()
    # start the backend process (token scanner). pass reset=True if --reset is in command-line args
    if "--reset" in sys.argv or "-r" in sys.argv:
        t1 = threading.Thread(target=lambda: start_backend_process(config=_config, reset=True))
    else:
        t1 = threading.Thread(target=lambda: start_backend_process(config=_config))
    t1.start()
    # sleep until backend is started, so that API server can function correctly
    # (TODO: sleep until backend process returns some flag indicating it's started)
    # time.sleep(DELAY_API_SERVER_START)
    # start the API server
    start_api_server(config=_config)
    # t2 = threading.Thread(target=lambda: start_api_server(config=_config))
    # t2.start()
    t1.join()
    # t2.join()


@ -1,390 +0,0 @@
import configparser
import re
import arrow

config = configparser.ConfigParser()
config.read('config.ini')

marker = None
operation = None
address = None
amount = None

months = {'jan': 1,
          'feb': 2,
          'mar': 3,
          'apr': 4,
          'may': 5,
          'jun': 6,
          'jul': 7,
          'aug': 8,
          'sep': 9,
          'oct': 10,
          'nov': 11,
          'dec': 12}
def isTransfer(text):
    wordlist = ['transfer', 'send', 'give']  # keep everything lowercase
    textList = text.split(' ')
    for word in wordlist:
        if word in textList:
            return True
    return False

def isIncorp(text):
    wordlist = ['incorporate', 'create', 'start']  # keep everything lowercase
    textList = text.split(' ')
    for word in wordlist:
        if word in textList:
            return True
    return False

def isSmartContract(text):
    textList = text.split(' ')
    for word in textList:
        if word == '':
            continue
        if word.endswith('@') and len(word) != 1:
            return word
    return False

def isSmartContractPay(text):
    wordlist = text.split(' ')
    if len(wordlist) != 2:
        return False
    smartContractTrigger = re.findall(r"smartContractTrigger:'.*'", text)[0].split('smartContractTrigger:')[1]
    smartContractTrigger = smartContractTrigger[1:-1]
    smartContractName = re.findall(r"smartContractName:.*@", text)[0].split('smartContractName:')[1]
    smartContractName = smartContractName[:-1]
    if smartContractTrigger and smartContractName:
        contractconditions = {'smartContractTrigger': smartContractTrigger, 'smartContractName': smartContractName}
        return contractconditions
    else:
        return False
def extractAmount(text, marker):
    count = 0
    returnval = None
    splitText = text.split('userchoice')[0].split(' ')
    for word in splitText:
        word = word.replace(marker, '')
        try:
            float(word)
            count = count + 1
            returnval = float(word)
        except ValueError:
            pass
    if count > 1:
        return 'Too many'
    return returnval

def extractMarker(text):
    textList = text.split(' ')
    for word in textList:
        if word == '':
            continue
        if word.endswith('#') and len(word) != 1:
            return word
    return False
def extractInitTokens(text):
    base_units = {'thousand': 10 ** 3, 'million': 10 ** 6, 'billion': 10 ** 9, 'trillion': 10 ** 12}
    textList = text.split(' ')
    counter = 0
    value = None
    for idx, word in enumerate(textList):
        try:
            result = float(word)
            if textList[idx + 1] in base_units:
                value = result * base_units[textList[idx + 1]]
                counter = counter + 1
            else:
                value = result
                counter = counter + 1
        except:
            for unit in base_units:
                result = word.split(unit)
                if len(result) == 2 and result[1] == '' and result[0] != '':
                    try:
                        value = float(result[0]) * base_units[unit]
                        counter = counter + 1
                    except:
                        continue
    if counter == 1:
        return value
    else:
        return None

def extractAddress(text):
    textList = text.split(' ')
    for word in textList:
        if word == '':
            continue
        if word[-1] == '$' and len(word) != 1:
            return word
    return None
def extractContractType(text):
    operationList = ['one-time-event*']  # keep everything lowercase
    count = 0
    returnval = None
    for operation in operationList:
        count = count + text.count(operation)
    if count > 1:
        return 'Too many'
    if count == 1 and (returnval is None):
        returnval = operation
    return returnval

def extractUserchoice(text):
    result = re.split('userchoice:\s*', text)
    if len(result) != 1 and result[1] != '':
        return result[1].strip().strip('"').strip("'")
    else:
        return None

def brackets_toNumber(item):
    return float(item[1:-1])
def extractContractConditions(text, contracttype, marker, blocktime):
    rulestext = re.split('contract-conditions:\s*', text)[-1]
    # rulelist = re.split('\d\.\s*', rulestext)
    rulelist = []
    numberList = re.findall(r'\(\d\d*\)', rulestext)
    for idx, item in enumerate(numberList):
        numberList[idx] = int(item[1:-1])
    numberList = sorted(numberList)
    for idx, item in enumerate(numberList):
        if numberList[idx] + 1 != numberList[idx + 1]:
            print('Contract condition numbers are not in order')
            return None
        if idx == len(numberList) - 2:
            break
    for i in range(len(numberList)):
        rule = rulestext.split('({})'.format(i + 1))[1].split('({})'.format(i + 2))[0]
        rulelist.append(rule.strip())
    if contracttype == 'one-time-event*':
        extractedRules = {}
        for rule in rulelist:
            if rule == '':
                continue
            elif rule[:10] == 'expirytime':
                expirytime = re.split('expirytime[\s]*=[\s]*', rule)[1].strip()
                try:
                    expirytime_split = expirytime.split(' ')
                    parse_string = '{}/{}/{} {}'.format(expirytime_split[3], months[expirytime_split[1]],
                                                        expirytime_split[2], expirytime_split[4])
                    expirytime_object = arrow.get(parse_string, 'YYYY/M/D HH:mm:ss').replace(tzinfo=expirytime_split[5])
                    blocktime_object = arrow.get(blocktime)
                    if expirytime_object < blocktime_object:
                        print('Expirytime of the contract is earlier than the block it is incorporated in. This incorporation will be rejected')
                        return None
                    extractedRules['expiryTime'] = expirytime
                except:
                    print('Error parsing expiry time')
                    return None
        for rule in rulelist:
            if rule == '':
                continue
            elif rule[:14] == 'contractamount':
                pattern = re.compile('[^contractamount\s*=\s*].*')
                searchResult = pattern.search(rule).group(0)
                contractamount = searchResult.split(marker)[0]
                try:
                    extractedRules['contractAmount'] = float(contractamount)
                except:
                    print("Contract amount entered is not a decimal")
            elif rule[:11] == 'userchoices':
                pattern = re.compile('[^userchoices\s*=\s*].*')
                conditions = pattern.search(rule).group(0)
                conditionlist = conditions.split('|')
                extractedRules['userchoices'] = {}
                for idx, condition in enumerate(conditionlist):
                    extractedRules['userchoices'][idx] = condition.strip()
            elif rule[:25] == 'minimumsubscriptionamount':
                pattern = re.compile('[^minimumsubscriptionamount\s*=\s*].*')
                searchResult = pattern.search(rule).group(0)
                minimumsubscriptionamount = searchResult.split(marker)[0]
                try:
                    extractedRules['minimumsubscriptionamount'] = float(minimumsubscriptionamount)
                except:
                    print("Minimum subscription amount entered is not a decimal")
            elif rule[:25] == 'maximumsubscriptionamount':
                pattern = re.compile('[^maximumsubscriptionamount\s*=\s*].*')
                searchResult = pattern.search(rule).group(0)
                maximumsubscriptionamount = searchResult.split(marker)[0]
                try:
                    extractedRules['maximumsubscriptionamount'] = float(maximumsubscriptionamount)
                except:
                    print("Maximum subscription amount entered is not a decimal")
            elif rule[:12] == 'payeeaddress':
                pattern = re.compile('[^payeeAddress\s*=\s*].*')
                searchResult = pattern.search(rule).group(0)
                payeeAddress = searchResult.split(marker)[0]
                extractedRules['payeeAddress'] = payeeAddress
        if len(extractedRules) > 1 and 'expiryTime' in extractedRules:
            return extractedRules
        else:
            return None
    return None
def extractTriggerCondition(text):
searchResult = re.search(r'".*"', text)
if searchResult is None:
searchResult = re.search(r"'.*'", text)
return searchResult
# Combine test
def parse_flodata(string, blockinfo, netvariable):
# todo Rule 20 - remove 'text:' from the start of flodata if it exists
if string[0:5] == 'text:':
string = string[5:]
# todo Rule 21 - Collapse multiple spaces into a single space in the whole of flodata
# todo Rule 22 - convert flodata to lowercase to make the system case insensitive
nospacestring = re.sub(' +', ' ', string)
cleanstring = nospacestring.lower()
# todo Rule 23 - Count number of words ending with @ and #
atList = []
hashList = []
for word in cleanstring.split(' '):
if word.endswith('@') and len(word) != 1:
atList.append(word)
if word.endswith('#') and len(word) != 1:
hashList.append(word)
# todo Rule 24 - Reject the following conditions - a. number of # & number of @ is equal to 0 then reject
# todo Rule 25 - If number of # or number of @ is greater than 1, reject
# todo Rule 25.a - If a transaction is rejected, it means parsed_data type is noise
# Filter noise first - check if the words end with either @ or #
if (len(atList) == 0 and len(hashList) == 0) or len(atList) > 1 or len(hashList) > 1:
parsed_data = {'type': 'noise'}
# todo Rule 26 - if number of # is 1 and number of @ is 0, then check if its token creation or token transfer transaction
elif len(hashList) == 1 and len(atList) == 0:
# Passing the above check means token creation or transfer
incorporation = isIncorp(cleanstring)
transfer = isTransfer(cleanstring)
# todo Rule 27 - if (neither token incorporation and token transfer) OR both token incorporation and token transfer, reject
if (not incorporation and not transfer) or (incorporation and transfer):
parsed_data = {'type': 'noise'}
# todo Rule 28 - if token creation and not token transfer then it is confirmed that is it a token creation transaction
# todo Rule 29 - Extract total number of tokens issued, if its not mentioned then reject
elif incorporation and not transfer:
initTokens = extractInitTokens(cleanstring)
if initTokens is not None:
parsed_data = {'type': 'tokenIncorporation', 'flodata': string, 'tokenIdentification': hashList[0][:-1],
'tokenAmount': initTokens}
else:
parsed_data = {'type': 'noise'}
# todo Rule 30 - if not token creation and is token transfer then then process it for token transfer rules
# todo Rule 31 - Extract number of tokens to be sent and the address to which to be sent, both data is mandatory
elif not incorporation and transfer:
amount = extractAmount(cleanstring, hashList[0][:-1])
if amount is not None and amount != 'Too many':
parsed_data = {'type': 'transfer', 'transferType': 'token', 'flodata': string,
'tokenIdentification': hashList[0][:-1],
'tokenAmount': amount}
else:
parsed_data = {'type': 'noise'}
# todo Rule 32 - if number of # is 1 and number of @ is 1, then process for smart contract transfer or creation
elif len(hashList) == 1 and len(atList) == 1:
# Passing the above check means Smart Contract creation or transfer
incorporation = isIncorp(cleanstring)
transfer = isTransfer(cleanstring)
# todo Rule 33 - if a confusing smart contract command is given, like creating and sending at the same time, or no
if (not incorporation and not transfer) or (incorporation and transfer):
parsed_data = {'type': 'noise'}
# todo Rule 34 - if incorporation and not transfer, then extract type of contract, address of the contract and conditions of the contract. Reject if any of those is not present
elif incorporation and not transfer:
contracttype = extractContractType(cleanstring)
contractaddress = extractAddress(nospacestring)
contractconditions = extractContractConditions(cleanstring, contracttype, marker=hashList[0][:-1],
blocktime=blockinfo['time'])
if config['DEFAULT']['NET'] == 'mainnet' and blockinfo['height'] < 3454510:
if None not in [contracttype, contractconditions]:
parsed_data = {'type': 'smartContractIncorporation', 'contractType': contracttype[:-1],
'tokenIdentification': hashList[0][:-1], 'contractName': atList[0][:-1],
'contractAddress': contractaddress[:-1], 'flodata': string,
'contractConditions': contractconditions}
else:
parsed_data = {'type': 'noise'}
else:
if None not in [contracttype, contractaddress, contractconditions]:
parsed_data = {'type': 'smartContractIncorporation', 'contractType': contracttype[:-1],
'tokenIdentification': hashList[0][:-1], 'contractName': atList[0][:-1],
'contractAddress': contractaddress[:-1], 'flodata': string,
'contractConditions': contractconditions}
else:
parsed_data = {'type': 'noise'}
# todo Rule 35 - if it is not incorporation and it is transfer, then extract smart contract amount to be locked and userPreference. If any of them is missing, then reject
elif not incorporation and transfer:
# We are at the send/transfer of smart contract
amount = extractAmount(cleanstring, hashList[0][:-1])
userChoice = extractUserchoice(cleanstring)
contractaddress = extractAddress(nospacestring)
if None not in [amount, userChoice]:
parsed_data = {'type': 'transfer', 'transferType': 'smartContract', 'flodata': string,
'tokenIdentification': hashList[0][:-1],
'operation': 'transfer', 'tokenAmount': amount, 'contractName': atList[0][:-1],
'userChoice': userChoice}
if contractaddress:
parsed_data['contractAddress'] = contractaddress[:-1]
else:
parsed_data = {'type': 'noise'}
# todo Rule 36 - Check for only a single @ and the substring "smart contract system says" in flodata, else reject
elif (len(hashList) == 0 and len(atList) == 1):
# Passing the above check means Smart Contract pays | exitcondition triggered from the committee
# todo Rule 37 - Extract the trigger condition given by the committee. If its missing, reject
triggerCondition = extractTriggerCondition(cleanstring)
if triggerCondition is not None:
parsed_data = {'type': 'smartContractPays', 'contractName': atList[0][:-1],
'triggerCondition': triggerCondition.group().strip()[1:-1]}
else:
parsed_data = {'type': 'noise'}
else:
parsed_data = {'type': 'noise'}
return parsed_data
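The @/# counting logic above (Rules 23–25) can be exercised standalone. `classify_flodata_words` below is a hypothetical helper written for illustration, not a function from this codebase; it mirrors the normalization and branch structure of `parse_flodata`:

```python
import re

def classify_flodata_words(flodata):
    # Sketch of Rules 21-25: collapse repeated spaces, lowercase, then count @- and #-words
    cleanstring = re.sub(' +', ' ', flodata).lower()
    words = cleanstring.split(' ')
    atList = [w for w in words if w.endswith('@') and len(w) != 1]
    hashList = [w for w in words if w.endswith('#') and len(w) != 1]
    if (not atList and not hashList) or len(atList) > 1 or len(hashList) > 1:
        return 'noise'
    if len(hashList) == 1 and not atList:
        return 'token-system'           # token creation or token transfer
    if len(hashList) == 1 and len(atList) == 1:
        return 'smart-contract-system'  # contract creation or participation
    return 'committee-trigger'          # one @-word and no #-word
```

For instance, `classify_flodata_words('create 500 million rmt#')` routes to the token-system branch, while a string with both an @-word and a #-word routes to the smart-contract branch.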


@ -1,16 +1,36 @@
aiofiles
APScheduler==3.9.1
arrow==1.1.0
bidict==0.21.2
blinker==1.4
certifi==2021.5.30
cffi==1.14.5
chardet==3.0.4
Click==7.0
greenlet==1.1.0
h11==0.9.0
h2==3.1.1
hpack==3.0.0
Hypercorn==0.8.2
hyperframe==5.2.0
idna==2.10
itsdangerous==1.1.0
Jinja2==2.10.1
MarkupSafe==1.1.1
multidict==4.5.2
priority==1.3.0
pycparser==2.20
pyflo-lib==2.0.9
python-dateutil==2.8.1
python-engineio==3.14.2
python-socketio==4.6.1
secp256k1==0.13.2
Quart==0.10.0
Quart-CORS==0.2.0
requests==2.25.0
six==1.16.0
sortedcontainers==2.1.0
SQLAlchemy==1.4.18
toml==0.10.0
typing-extensions==3.7.4
urllib3==1.26.5
websockets==12.0
wsproto==0.15.0

12
src/api/.gitignore vendored Normal file

@ -0,0 +1,12 @@
.vscode/
__pycache__/
*.swp
config.py
.idea/
py3.7/
py3/
py3.8/
*.db
*.code-workspace
*.log
py*/

1068
src/api/README.md Normal file

File diff suppressed because it is too large

3114
src/api/api_main.py Normal file

File diff suppressed because it is too large

150
src/api/fetchRates.py Normal file

@ -0,0 +1,150 @@
import requests
import json
import sqlite3
import os
from config import *
import time
RETRY_TIMEOUT_DB = 60 # 60 sec
RETRY_TIMEOUT_REQUEST = 10 * 60 # 10 min
prices = {}
# 1. fetch old price data if it's there, else create an empty db
def connect_database():
if not os.path.isfile("prices.db"):
# create an empty db
while True:
try:
conn = sqlite3.connect('prices.db')
c = conn.cursor()
c.execute('''CREATE TABLE ratepairs
(id integer primary key, ratepair text, price real)''')
c.execute("INSERT INTO ratepairs(ratepair, price) VALUES ('BTCBTC', 1)")
c.execute("INSERT INTO ratepairs(ratepair, price) VALUES ('BTCUSD', -1)")
c.execute("INSERT INTO ratepairs(ratepair, price) VALUES ('BTCINR', -1)")
c.execute("INSERT INTO ratepairs(ratepair, price) VALUES ('USDINR', -1)")
c.execute("INSERT INTO ratepairs(ratepair, price) VALUES ('FLOUSD', -1)")
conn.commit()
conn.close()
except:
print(f"Unable to create prices.db, retrying in {RETRY_TIMEOUT_DB} sec")
time.sleep(RETRY_TIMEOUT_DB)
else:
break
# load existing price data
global prices
while True:
try:
conn = sqlite3.connect('prices.db')
c = conn.cursor()
ratepairs = c.execute('select ratepair, price from ratepairs')
ratepairs = ratepairs.fetchall()
for ratepair in ratepairs:
ratepair = list(ratepair)
prices[ratepair[0]] = ratepair[1]
except:
print(f"Unable to read prices.db, retrying in {RETRY_TIMEOUT_DB} sec")
time.sleep(RETRY_TIMEOUT_DB)
else:
break
# 2. fetch new price data
def fetch_newprice():
global prices
while True:
try:
# apilayer
response = requests.get(f"http://apilayer.net/api/live?access_key={apilayerAccesskey}")
try:
price = response.json()
prices['USDINR'] = price['quotes']['USDINR']
break
except ValueError:
print(f'JSON parse error, retrying in {RETRY_TIMEOUT_REQUEST} sec')
time.sleep(RETRY_TIMEOUT_REQUEST)
except:
print(f"Unable to fetch new price data, retrying in {RETRY_TIMEOUT_REQUEST} sec")
time.sleep(RETRY_TIMEOUT_REQUEST)
def fetch_bitpay_or_coindesk():
# bitpay
global prices
while True:
print("Trying bitpay API")
try:
response = requests.get('https://bitpay.com/api/rates')
bitcoinRates = response.json()
for currency in bitcoinRates:
if currency['code'] == 'USD':
prices['BTCUSD'] = currency['rate']
elif currency['code'] == 'INR':
prices['BTCINR'] = currency['rate']
except ValueError:
print("JSON parse error in bitpay")
except:
print("Unable to fetch bitpay")
else:
break # data fetched from bitpay; exit the loop and proceed to the next step
print("Trying coindesk API")
# coindesk
try:
response = requests.get('https://api.coindesk.com/v1/bpi/currentprice.json')
price = response.json()
prices['BTCUSD'] = price['bpi']['USD']['rate']
except ValueError:
print('JSON parse error in coindesk')
except:
print("Unable to fetch coindesk")
else:
break # data fetched from coindesk; exit the loop and proceed to the next step
print(f"Retrying in {RETRY_TIMEOUT_REQUEST} sec")
time.sleep(RETRY_TIMEOUT_REQUEST)
# cryptocompare
def fetch_cryptocompare():
while True:
try:
response = requests.get('https://min-api.cryptocompare.com/data/histoday?fsym=FLO&tsym=USD&limit=1&aggregate=3&e=CCCAGG')
price = response.json()
prices['FLOUSD'] = price['Data'][-1]['close']
except ValueError:
print(f'JSON parse error in cryptocompare, retrying in {RETRY_TIMEOUT_REQUEST} sec')
time.sleep(RETRY_TIMEOUT_REQUEST)
except:
print(f"Unable to fetch cryptocompare, retrying in {RETRY_TIMEOUT_REQUEST} sec")
time.sleep(RETRY_TIMEOUT_REQUEST)
else:
break # data fetched from cryptocompare; proceed to the next step
# 3. update latest price data
def update_latest_prices():
while True:
try:
conn = sqlite3.connect('prices.db')
c = conn.cursor()
for ratepair, price in prices.items():
c.execute("UPDATE ratepairs SET price=? WHERE ratepair=?", (price, ratepair))
conn.commit()
except:
print(f"Unable to write to prices.db, retrying in {RETRY_TIMEOUT_DB} sec")
time.sleep(RETRY_TIMEOUT_DB)
else:
conn.close()
break
connect_database()
fetch_newprice()
fetch_bitpay_or_coindesk()
fetch_cryptocompare()
print('\n\n')
print(prices)
update_latest_prices()
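Reading rates back out of `prices.db` works the same way in reverse. `get_rate` below is a hypothetical helper sketch, not part of fetchRates.py; it uses sqlite3 parameter binding (`?` placeholders) rather than interpolating values into the SQL string:

```python
import sqlite3

def get_rate(db_path, ratepair):
    # Look up one price from the ratepairs table; returns None when the pair is absent
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT price FROM ratepairs WHERE ratepair=?", (ratepair,)
        ).fetchone()
        return row[0] if row else None
    finally:
        conn.close()
```

With the schema created by `connect_database()`, `get_rate('prices.db', 'BTCUSD')` would return the last stored price, or `-1` if it was never updated.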

1240
src/api/parsing.py Normal file

File diff suppressed because it is too large


@ -0,0 +1,24 @@
document.addEventListener('DOMContentLoaded', function() {
var es = new EventSource('/sse');
es.onmessage = function (event) {
var messages_dom = document.getElementsByTagName('ul')[0];
var message_dom = document.createElement('li');
var content_dom = document.createTextNode('Received: ' + event.data);
message_dom.appendChild(content_dom);
messages_dom.appendChild(message_dom);
};
document.getElementById('send').onclick = function() {
fetch('/', {
method: 'POST',
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
body: JSON.stringify ({
message: document.getElementsByName("message")[0].value,
}),
});
document.getElementsByName("message")[0].value = "";
};
});


@ -0,0 +1,12 @@
<!doctype html>
<html>
<head>
<title>SSE example</title>
</head>
<body>
<input name="message" type="text">
<button id="send">Send</button>
<ul></ul>
<script type="text/javascript" src="{{ url_for('static', filename='broadcast.js') }}"></script>
</body>
</html>

4
src/api/wsgi.py Normal file

@ -0,0 +1,4 @@
from api_main import app
if __name__ == "__main__":
app.run()

2793
src/backend/backend_main.py Normal file

File diff suppressed because it is too large

43
src/backend/convert_db.py Normal file

@ -0,0 +1,43 @@
from models import SystemData, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, RejectedTransactionHistory, Base, ContractStructure, ContractBase, ContractParticipants, SystemBase, ActiveContracts, ContractAddressMapping, LatestCacheBase, ContractTransactionHistory, RejectedContractTransactionHistory, TokenContractAssociation, ContinuosContractBase, ContractStructure1, ContractParticipants1, ContractDeposits1, ContractTransactionHistory1, LatestTransactions, LatestBlocks, DatabaseTypeMapping, TokenAddressMapping, LatestCacheBase1, LatestTransactions1, LatestBlocks1
import pdb
from sqlalchemy import create_engine, func
from sqlalchemy.orm import sessionmaker
def create_database_session_orm(type, parameters, base):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
return session
# connect to the database convert_db
convert_db = create_database_session_orm('system_dbs', {'db_name': 'convertdb'}, LatestCacheBase1)
latest_blocks = convert_db.query(LatestBlocks1).all()
latest_txs = convert_db.query(LatestTransactions1).all()
# create a new session for the latestCache database and copy the rows over
convert_db_1 = create_database_session_orm('system_dbs', {'db_name': 'latestCache'}, LatestCacheBase)
for block in latest_blocks:
convert_db_1.add(LatestBlocks(blockNumber=block.blockNumber, blockHash=block.blockHash, jsonData=block.jsonData))
for tx in latest_txs:
convert_db_1.add(LatestTransactions(transactionHash=tx.transactionHash, blockNumber=tx.blockNumber, jsonData=tx.jsonData, transactionType=tx.transactionType, parsedFloData=tx.parsedFloData))
convert_db_1.commit()
convert_db_1.close()
convert_db.close()


@ -1,13 +1,14 @@
from sqlalchemy import Column, Integer, Float, String
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
TokenBase = declarative_base()
ContractBase = declarative_base()
ContinuosContractBase = declarative_base()
SystemBase = declarative_base()
LatestCacheBase = declarative_base()
class ActiveTable(Base):
class ActiveTable(TokenBase):
__tablename__ = "activeTable"
id = Column('id', Integer, primary_key=True)
@ -15,9 +16,12 @@ class ActiveTable(Base):
parentid = Column('parentid', Integer)
consumedpid = Column('consumedpid', String)
transferBalance = Column('transferBalance', Float)
addressBalance = Column('addressBalance', Float)
orphaned_parentid = Column('orphaned_parentid', Integer)
blockNumber = Column('blockNumber', Integer)
class ConsumedTable(Base):
class ConsumedTable(TokenBase):
__tablename__ = "consumedTable"
primaryKey = Column('primaryKey', Integer, primary_key=True)
@ -26,9 +30,12 @@ class ConsumedTable(Base):
parentid = Column('parentid', Integer)
consumedpid = Column('consumedpid', String)
transferBalance = Column('transferBalance', Float)
addressBalance = Column('addressBalance', Float)
orphaned_parentid = Column('orphaned_parentid', Integer)
blockNumber = Column('blockNumber', Integer)
class TransferLogs(Base):
class TransferLogs(TokenBase):
__tablename__ = "transferlogs"
primary_key = Column('id', Integer, primary_key=True)
@ -42,7 +49,7 @@ class TransferLogs(Base):
transactionHash = Column('transactionHash', String)
class TransactionHistory(Base):
class TransactionHistory(TokenBase):
__tablename__ = "transactionHistory"
primary_key = Column('id', Integer, primary_key=True)
@ -59,7 +66,7 @@ class TransactionHistory(Base):
parsedFloData = Column('parsedFloData', String)
class TokenContractAssociation(Base):
class TokenContractAssociation(TokenBase):
__tablename__ = "tokenContractAssociation"
primary_key = Column('id', Integer, primary_key=True)
@ -116,14 +123,82 @@ class ContractTransactionHistory(ContractBase):
parsedFloData = Column('parsedFloData', String)
class RejectedContractTransactionHistory(SystemBase):
__tablename__ = "rejectedContractTransactionHistory"
class ContractDeposits(ContractBase):
__tablename__ = "contractdeposits"
id = Column('id', Integer, primary_key=True)
depositorAddress = Column('depositorAddress', String)
depositAmount = Column('depositAmount', Float)
depositBalance = Column('depositBalance', Float)
expiryTime = Column('expiryTime', String)
unix_expiryTime = Column('unix_expiryTime', Integer)
status = Column('status', String)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
class ConsumedInfo(ContractBase):
__tablename__ = "consumedinfo"
id = Column('id', Integer, primary_key=True)
id_deposittable = Column('id_deposittable', Integer)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
class ContractWinners(ContractBase):
__tablename__ = "contractwinners"
id = Column('id', Integer, primary_key=True)
participantAddress = Column('participantAddress', String)
winningAmount = Column('winningAmount', Float)
userChoice = Column('userChoice', String)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
referenceTxHash = Column('referenceTxHash', String)
class ContractStructure2(ContinuosContractBase):
__tablename__ = "contractstructure"
id = Column('id', Integer, primary_key=True)
attribute = Column('attribute', String)
index = Column('index', Integer)
value = Column('value', String)
class ContractParticipants2(ContinuosContractBase):
__tablename__ = "contractparticipants"
id = Column('id', Integer, primary_key=True)
participantAddress = Column('participantAddress', String)
tokenAmount = Column('tokenAmount', Float)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
class ContractDeposits2(ContinuosContractBase):
__tablename__ = "contractdeposits"
id = Column('id', Integer, primary_key=True)
depositorAddress = Column('depositorAddress', String)
depositAmount = Column('depositAmount', Float)
expiryTime = Column('expiryTime', String)
status = Column('status', String)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
class ContractTransactionHistory2(ContinuosContractBase):
__tablename__ = "contractTransactionHistory"
primary_key = Column('id', Integer, primary_key=True)
transactionType = Column('transactionType', String)
transactionSubType = Column('transactionSubType', String)
contractName = Column('contractName', String)
contractAddress = Column('contractAddress', String)
sourceFloAddress = Column('sourceFloAddress', String)
destFloAddress = Column('destFloAddress', String)
transferAmount = Column('transferAmount', Float)
@ -133,26 +208,6 @@ class RejectedContractTransactionHistory(SystemBase):
transactionHash = Column('transactionHash', String)
blockchainReference = Column('blockchainReference', String)
jsonData = Column('jsonData', String)
rejectComment = Column('rejectComment', String)
parsedFloData = Column('parsedFloData', String)
class RejectedTransactionHistory(SystemBase):
__tablename__ = "rejectedTransactionHistory"
primary_key = Column('id', Integer, primary_key=True)
tokenIdentification = Column('tokenIdentification', String)
sourceFloAddress = Column('sourceFloAddress', String)
destFloAddress = Column('destFloAddress', String)
transferAmount = Column('transferAmount', Float)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
time = Column('time', Integer)
transactionHash = Column('transactionHash', String)
blockchainReference = Column('blockchainReference', String)
jsonData = Column('jsonData', String)
rejectComment = Column('rejectComment', String)
transactionType = Column('transactionType', String)
parsedFloData = Column('parsedFloData', String)
@ -206,19 +261,89 @@ class TokenAddressMapping(SystemBase):
blockHash = Column('blockHash', String)
class LatestTransactions(LatestCacheBase):
__tablename__ = "latestTransactions"
class DatabaseTypeMapping(SystemBase):
__tablename__ = "databaseTypeMapping"
id = Column('id', Integer, primary_key=True)
db_name = Column('db_name', String)
db_type = Column('db_type', String)
keyword = Column('keyword', String)
object_format = Column ('object_format', String)
blockNumber = Column('blockNumber', Integer)
class TimeActions(SystemBase):
__tablename__ = "time_actions"
id = Column('id', Integer, primary_key=True)
time = Column('time', String)
activity = Column('activity', String)
status = Column('status', String)
contractName = Column('contractName', String)
contractAddress = Column('contractAddress', String)
contractType = Column('contractType', String)
tokens_db = Column('tokens_db', String)
parsed_data = Column('parsed_data', String)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', String)
blockNumber = Column('blockNumber', Integer)
class RejectedContractTransactionHistory(SystemBase):
__tablename__ = "rejectedContractTransactionHistory"
primary_key = Column('id', Integer, primary_key=True)
transactionType = Column('transactionType', String)
transactionSubType = Column('transactionSubType', String)
contractName = Column('contractName', String)
contractAddress = Column('contractAddress', String)
sourceFloAddress = Column('sourceFloAddress', String)
destFloAddress = Column('destFloAddress', String)
transferAmount = Column('transferAmount', Float)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
time = Column('time', Integer)
transactionHash = Column('transactionHash', String)
blockchainReference = Column('blockchainReference', String)
jsonData = Column('jsonData', String)
rejectComment = Column('rejectComment', String)
parsedFloData = Column('parsedFloData', String)
class RejectedTransactionHistory(SystemBase):
__tablename__ = "rejectedTransactionHistory"
primary_key = Column('id', Integer, primary_key=True)
tokenIdentification = Column('tokenIdentification', String)
sourceFloAddress = Column('sourceFloAddress', String)
destFloAddress = Column('destFloAddress', String)
transferAmount = Column('transferAmount', Float)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
time = Column('time', Integer)
transactionHash = Column('transactionHash', String)
blockchainReference = Column('blockchainReference', String)
jsonData = Column('jsonData', String)
rejectComment = Column('rejectComment', String)
transactionType = Column('transactionType', String)
parsedFloData = Column('parsedFloData', String)
class LatestTransactions(LatestCacheBase):
__tablename__ = "latestTransactions"
id = Column('id', Integer, primary_key=True)
transactionHash = Column('transactionHash', String)
blockNumber = Column('blockNumber', Integer)
jsonData = Column('jsonData', String)
transactionType = Column('transactionType', String)
parsedFloData = Column('parsedFloData', String)
db_reference = Column('db_reference', String)
class LatestBlocks(LatestCacheBase):
__tablename__ = "latestBlocks"
id = Column('id', Integer, primary_key=True)
blockNumber = Column('blockNumber', String)
blockNumber = Column('blockNumber', Integer)
blockHash = Column('blockHash', String)
jsonData = Column('jsonData', String)


@ -0,0 +1,281 @@
"""
DEFINITIONS:
Special character words - A word ending with one of the special characters (#, *, @)
#-word - Token name
@-word - Smart Contract name
*-word - Smart Contract type
"""
"""
FIND RULES
1. Identify all Special character words in a text string >> and output as a list of those words
2. Apply rule 1, but only before a marker or keyword like ":" and output as a list of those words
3. Find a number in the string
5. Check for an occurrence of an exact ordering of special character words
eg. for a one-time-event smart contract (identified using a *-word), the existence of a #-word should be checked before the ':' and that #-word output
for a continuous-event smart contract (identified using a *-word, with subtype tokenswap), the #-words should be checked after the ':' and the two hash words output
6. Given a string of the type contract conditions, format and output an object string by removing = and by removing number references
7. Identify all the special character words in a text string such that spaces are not taken into account, eg. Input string => "contract-conditions :(2) accepting_token=rupee#(3) selling_token = bioscope# " |
Output string => ["rupee#","bioscope#"]
"""
def findrule1(rawstring, special_character):
wordList = []
for word in rawstring.split(' '):
if word.endswith(special_character) and len(word) != 1:
wordList.append(word)
return wordList
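Find rule 1 can be exercised directly; it is restated self-contained here so the example runs on its own:

```python
def findrule1(rawstring, special_character):
    # Collect every word that ends with the given special character
    # (a bare '#' / '@' / '*' on its own is ignored)
    wordList = []
    for word in rawstring.split(' '):
        if word.endswith(special_character) and len(word) != 1:
            wordList.append(word)
    return wordList
```

For example, on `'create india-elections@ of type one-time-event* using asset rmt#'` the rule extracts `['rmt#']` for `'#'` and `['india-elections@']` for `'@'`.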
def findrule3(text):
base_units = {'thousand': 10 ** 3, 'million': 10 ** 6, 'billion': 10 ** 9, 'trillion': 10 ** 12}
textList = text.split(' ')
counter = 0
value = None
for idx, word in enumerate(textList):
try:
result = float(word)
if textList[idx + 1] in base_units:
value = result * base_units[textList[idx + 1]]
counter = counter + 1
else:
value = result
counter = counter + 1
except (ValueError, IndexError):
for unit in base_units:
result = word.split(unit)
if len(result) == 2 and result[1] == '' and result[0] != '':
try:
value = float(result[0]) * base_units[unit]
counter = counter + 1
except:
continue
if counter == 1:
return value
else:
return None
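Find rule 3 resolves a number with an optional magnitude suffix and rejects ambiguous strings containing more than one number. It is restated self-contained below (same logic as above, with the bare except narrowed and an explicit bounds check) so the examples run on their own:

```python
def findrule3(text):
    # Find exactly one number in the text, honouring magnitude suffixes
    base_units = {'thousand': 10 ** 3, 'million': 10 ** 6, 'billion': 10 ** 9, 'trillion': 10 ** 12}
    textList = text.split(' ')
    counter = 0
    value = None
    for idx, word in enumerate(textList):
        try:
            result = float(word)
            if idx + 1 < len(textList) and textList[idx + 1] in base_units:
                value = result * base_units[textList[idx + 1]]
            else:
                value = result
            counter = counter + 1
        except ValueError:
            # handle fused forms like "5million"
            for unit in base_units:
                parts = word.split(unit)
                if len(parts) == 2 and parts[1] == '' and parts[0] != '':
                    try:
                        value = float(parts[0]) * base_units[unit]
                        counter = counter + 1
                    except ValueError:
                        continue
    return value if counter == 1 else None
```

So `'create 10 thousand tokens'` resolves to `10000.0`, the fused `'send 5million rmt#'` to `5000000.0`, and a string with two numbers is rejected with `None`.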
"""
TRUE-FALSE RULES
1. Check if subtype = tokenswap exists in a given string,
2. Find if any one of the special words in a list is present, ie. [start, create, incorporate], and none of the words in a second list is present, like [send, transfer, give]
"""
import re
def findWholeWord(w):
return re.compile(r'\b({0})\b'.format(w), flags=re.IGNORECASE).search
'''
findWholeWord('seek')('those who seek shall find') # -> <match object>
findWholeWord('word')('swordsmith')         # -> None (no whole-word match)
'''
def truefalse_rule1(rawstring, string_tobe_checked):
nowhites_rawstring = rawstring.replace(" ","").lower()
if string_tobe_checked.replace(" ","").lower() in nowhites_rawstring:
return True
else:
return False
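True-false rule 1 is a whitespace- and case-insensitive substring check; restated self-contained for a quick demonstration:

```python
def truefalse_rule1(rawstring, string_tobe_checked):
    # Strip all spaces and lowercase both sides before the substring test,
    # so 'subtype = tokenswap' and 'subtype=tokenswap' compare equal
    nowhites_rawstring = rawstring.replace(" ", "").lower()
    return string_tobe_checked.replace(" ", "").lower() in nowhites_rawstring
```

This is how `subtype = tokenswap` is detected regardless of the spacing the user put around `=`.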
denied_list = ['transfer', 'send', 'give'] # keep everything lowercase
permitted_list = ['incorporate', 'create', 'start'] # keep everything lowercase
def truefalse_rule2(rawstring, permitted_list, denied_list):
# Find transfer , send , give
foundPermitted = None
foundDenied = None
for word in permitted_list:
if findWholeWord(word)(rawstring):
foundPermitted = word
break
for word in denied_list:
if findWholeWord(word)(rawstring):
foundDenied = word
break
if (foundPermitted is not None) and (foundDenied is None):
return True
else:
return False
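True-false rule 2 together with `findWholeWord` distinguishes creation from transfer wording. The pair is restated self-contained below (with the two loops condensed to `any()`, behaviour unchanged):

```python
import re

def findWholeWord(w):
    # Whole-word, case-insensitive search (so 'create' does not match 'recreate')
    return re.compile(r'\b({0})\b'.format(w), flags=re.IGNORECASE).search

def truefalse_rule2(rawstring, permitted_list, denied_list):
    # True only when some permitted word occurs and no denied word does
    foundPermitted = any(findWholeWord(word)(rawstring) for word in permitted_list)
    foundDenied = any(findWholeWord(word)(rawstring) for word in denied_list)
    return foundPermitted and not foundDenied
```

A string containing both 'create' and 'send' is rejected, and the word-boundary anchors keep 'recreate' from counting as 'create'.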
def selectCateogry(rawstring, wordlist, category1, category2):
pass
"""
CLASSIFY RULES
1. Based on various combinations of the special character words and special words, create categorizations
eg. 1.1 if there is only one #-word, then the flodata is related purely to token system
1.2 if there is one #-word, one @-word .. then it is related to smart contract system, but cannot be a creation type since smart contract creaton needs to specify contract type with *-word
1.3 if there is one
2. Check if it is of the value 'one-time-event' or 'continuos-event'
"""
"""
REJECT RULES
1. The number of *-words has to be exactly 1, ie. you can specify only one contract type at once, otherwise noise
2. *-word has to fall in the following type ['one-time-event*', 'continuous-event*'], otherwise noise
3. @-word should exist only before the : , otherwise noise
4. There should be only one @-word, otherwise noise
5. for one-time-event smart contract( identified using one-time-event*), if there is no #-word before : -> reject as noise
6. for one-time-event smart contract( identified using one-time-event*) if there is more than one #-word before : -> reject as noise
7. for one-time-event smart contract( identified using one-time-event*) if there is/are #-word(s) after colon -> reject as noise
8. for continuos-event smart contract( identified using continuos-event*) if there is one or more #-word before : > reject as noise
9. for continuos-event smart contract( identified using continuos-event*)( with subtype token-swap ) if there is one or more than two #-word after : > reject as noise
10.
"""
def rejectrule9(rawtext, starword):
pass
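`rejectrule9` above is only a stub. A minimal sketch follows, under the reading that a continuous-event token-swap contract needs exactly two #-words after the ':' (the function name, the exact-two count, and the word pattern are assumptions, not confirmed behaviour):

```python
import re

def rejectrule9_sketch(rawtext):
    # Return True (reject as noise) unless exactly two #-words follow the ':'
    parts = rawtext.split(':', 1)
    if len(parts) != 2:
        return True  # no ':' at all, so no swap pair can be declared
    # match words ending in '#' even when fused with '=' or '(n)' markers
    hashwords = re.findall(r'[\w-]+#', parts[1])
    return len(hashwords) != 2
```

On the example string from find rule 7 (`accepting_token=rupee#` and `selling_token = bioscope#` after the ':') the sketch accepts; with only one #-word it rejects.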
# extractContractConditions(cleanstring, contracttype, blocktime=blockinfo['time'], marker=hashList[0][:-1])
# Token incorporation operation
## Existence of keyword
"""
APPLY RULES
1. After application of apply rule1, a parser rule will either return a value or will classify the result as noise
"""
def apply_rule1(*argv):
a = argv[0](*argv[1:])
if a is False or a is None:
return "noise"
return a
# If any of the parser rule returns a value, then queue it for further processing, otherwise send noise to the output engine
apply_rule1(findrule1, rawstring, special_character)
def outputreturn(*argv):
if argv[0] == 'noise':
parsed_data = {'type': 'noise'}
elif argv[0] == 'token_incorporation':
parsed_data = {
'type': 'tokenIncorporation',
'flodata': argv[1], #string
'tokenIdentification': argv[2], #hashList[0][:-1]
'tokenAmount': argv[3] #initTokens
}
elif argv[0] == 'token_transfer':
parsed_data = {
'type': 'transfer',
'transferType': 'token',
'flodata': argv[1], #string
'tokenIdentification': argv[2], #hashList[0][:-1]
'tokenAmount': argv[3] #amount
}
elif argv[0] == 'one-time-event-userchoice-smartcontract-incorporation':
parsed_data = {
'type': 'smartContractIncorporation',
'contractType': 'one-time-event',
'tokenIdentification': argv[1], #hashList[0][:-1]
'contractName': argv[2], #atList[0][:-1]
'contractAddress': argv[3], #contractaddress[:-1]
'flodata': argv[4], #string
'contractConditions': {
'contractamount' : argv[5],
'minimumsubscriptionamount' : argv[6],
'maximumsubscriptionamount' : argv[7],
'payeeaddress' : argv[8],
'userchoice' : argv[9],
'expiryTime' : argv[10]
}
}
elif argv[0] == 'one-time-event-userchoice-smartcontract-participation':
parsed_data = {
'type': 'transfer',
'transferType': 'smartContract',
'flodata': argv[1], #string
'tokenIdentification': argv[2], #hashList[0][:-1]
'operation': 'transfer',
'tokenAmount': argv[3], #amount
'contractName': argv[4], #atList[0][:-1]
'userChoice': argv[5] #userChoice
}
elif argv[0] == 'one-time-event-userchoice-smartcontract-trigger':
parsed_data = {
'type': 'smartContractPays',
'contractName': argv[1], #atList[0][:-1]
'triggerCondition': argv[2] #triggerCondition.group().strip()[1:-1]
}
elif argv[0] == 'one-time-event-time-smartcontract-incorporation':
parsed_data = {
'type': 'smartContractIncorporation',
'contractType': 'one-time-event',
'tokenIdentification': argv[1], #hashList[0][:-1]
'contractName': argv[2], #atList[0][:-1]
'contractAddress': argv[3], #contractaddress[:-1]
'flodata': argv[4], #string
'contractConditions': {
'contractamount' : argv[5],
'minimumsubscriptionamount' : argv[6],
'maximumsubscriptionamount' : argv[7],
'payeeaddress' : argv[8],
'expiryTime' : argv[9]
}
}
elif argv[0] == 'one-time-event-time-smartcontract-participation':
parsed_data = {
'type': 'transfer',
'transferType': 'smartContract',
'flodata': argv[1], #string
'tokenIdentification': argv[2], #hashList[0][:-1]
'operation': 'transfer',
'tokenAmount': argv[3], #amount
'contractName': argv[4] #atList[0][:-1]
}
elif argv[0] == 'continuos-event-token-swap-incorporation':
parsed_data = {
'type': 'smartContractIncorporation',
'contractType': 'continuos-event',
'tokenIdentification': argv[1], #hashList[0][:-1]
'contractName': argv[2], #atList[0][:-1]
'contractAddress': argv[3], #contractaddress[:-1]
'flodata': argv[4], #string
'contractConditions': {
'subtype' : argv[5], #tokenswap
'accepting_token' : argv[6],
'selling_token' : argv[7],
'pricetype' : argv[8],
'price' : argv[9],
}
}
elif argv[0] == 'continuos-event-token-swap-deposit':
parsed_data = {
'type': 'smartContractDeposit',
'tokenIdentification': argv[1], #hashList[0][:-1]
'depositAmount': argv[2], #depositAmount
'contractName': argv[3], #atList[0][:-1]
'flodata': argv[4], #string
'depositConditions': {
'expiryTime' : argv[5]
}
}
elif argv[0] == 'continuos-event-token-swap-participation':
parsed_data = {
'type': 'smartContractParticipation',
'tokenIdentification': argv[1], #hashList[0][:-1]
'sendAmount': argv[2], #sendAmount
'receiveAmount': argv[3], #receiveAmount
'contractName': argv[4], #atList[0][:-1]
'flodata': argv[5] #string
}

src/backend/parsing.py (new file, 1325 lines; diff suppressed because it is too large)

src/backend/planning.txt (new file, 296 lines)
'''
TEMPLATE FOR SECOND STAGE AFTER INPUT CLASSIFIER
IF BLOCK: if the output of the input classifier is tokensystem-C,
JUST LINEARLY START BUILDING IT
then first start building the known outputs
// outputreturn('token_incorporation',f"{flodata}", f"{tokenname}", f"{tokenamount}")
f"{flodata}" = rawstring
f"{tokenname}" = wordlist entry
tokensystem-C-resolved = Output of second stage classification
f"{tokenamount}" = find_number_function
'''
'''
The problem we are facing:
* Token transactions don't have * or @ symbols
* Smart Contract transactions have * , @ , # symbols
* Smart Contract transaction of the type one time event have 1 # before colon
* Smart Contract transaction of the type continuous event has 2 # after colon
* So we are checking for hashes based on the type of smart contract(identified by *)
* But the above check disregards checking hashes in token transactions
'''
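The check described above can be sketched by extracting the ordered special-symbol sequence of a flodata string and counting `#`-words on either side of the colon; the symbol set and the `#`-suffix convention are taken from the sample flodata listed below:

```python
SPECIAL = set('*@#$:')

def symbol_signature(flodata):
    # Ordered list of special characters as they appear in the flodata;
    # this is the signature used for first-pass classification.
    return [c for c in flodata if c in SPECIAL]

def hash_counts(flodata):
    # Count #-words before and after the first colon; one-time-event and
    # continuous-event contracts are distinguished by these counts.
    before, _, after = flodata.partition(':')
    count = lambda part: sum(1 for w in part.split() if w.endswith('#'))
    return count(before), count(after)
```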
# Write down all the possible flodata (with all combinations possible) for each transaction category
'''
Token creation
create 500 million rmt#
['#']
Token transfer
transfer 200 rmt#
['#']
One time event userchoice creation
Create Smart Contract with the name India-elections-2019@ of the type one-time-event* using the asset rmt# at the address F7osBpjDDV1mSSnMNrLudEQQ3cwDJ2dPR1$ with contract-conditions: (1) contractAmount=0.001rmt (2) userChoices=Narendra Modi wins| Narendra Modi loses (3) expiryTime= Wed May 22 2019 21:00:00 GMT+0530
['@','*','#','$',':']
['@','*','#','$',':','#']
One time event userchoice participation
send 0.001 rmt# to india-elections-2019@ to FLO address F7osBpjDDV1mSSnMNrLudEQQ3cwDJ2dPR1 with the userchoice:'narendra modi wins'
['#','@',':']
['#','@','$',':']
One time event userchoice trigger
india-elections-2019@ winning-choice:'narendra modi wins'
['@',':']
One time event timeevent creation
Create Smart Contract with the name India-elections-2019@ of the type one-time-event* using the asset rmt# at the address F7osBpjDDV1mSSnMNrLudEQQ3cwDJ2dPR1$ with contract-conditions: (1) contractAmount=0.001rmt (2) expiryTime= Wed May 22 2019 21:00:00 GMT+0530
['@','*','#','$',':']
['@','*','#','$',':','#']
One time event timeevent participation
send 0.001 rmt# to india-elections-2019@ to FLO address F7osBpjDDV1mSSnMNrLudEQQ3cwDJ2dPR1
['#','@']
['#','@','$']
Continuous event token swap creation
Create Smart Contract with the name swap-rupee-bioscope@ of the type continuous-event* at the address oRRCHWouTpMSPuL6yZRwFCuh87ZhuHoL78$ with contract-conditions :
(1) subtype = tokenswap
(2) accepting_token = rupee#
(3) selling_token = bioscope#
(4) price = '15'
(5) priceType = predetermined
(6) direction = oneway
['@','*','$',':','#','#']
Continuous event tokenswap deposit
Deposit 15 bioscope# to swap-rupee-bioscope@ its FLO address being oRRCHWouTpMSPuL6yZRwFCuh87ZhuHoL78$ with deposit-conditions: (1) expiryTime= Wed Nov 17 2021 21:00:00 GMT+0530
['#','@',':']
['#','@','$',':']
Continuous event tokenswap participation
Send 15 rupee# to swap-rupee-article@ its FLO address being FJXw6QGVVaZVvqpyF422Aj4FWQ6jm8p2dL$
['#','@']
['#','@','$']
'''
'''
['#'] - Token creation
['#'] - Token participation
['@','*','#','$',':'] - Smart contract creation user-choice
['@','*','#','$',':','#']
['#','@',':'] - Smart contract participation user-choice
['#','@','$',':']
['@',':'] - Smart contract trigger user-choice
['@','*','#','$',':'] - Smart contract creation - ote-timebased
['@','*','#','$',':','#']
['#','@'] - Smart contract participation - ote-timebased
['#','@','$']
['@','*','$',':','#','#'] - Smart contract creation - continuous event - tokenswap
['#','@',':'] - Smart contract deposit - continuous event - tokenswap
['#','@','$',':']
['#','@'] - Smart contract participation - continuous event - tokenswap
['#','@','$'] - Smart contract participation - continuous event - tokenswap
'''
'''
['#'] - Token creation
['#'] - Token participation
['@','*','#','$',':'] - Smart contract creation ote-userchoice
['@','*','#','$',':','#']
['@','*','#','$',':'] - Smart contract creation - ote-timebased
['@','*','#','$',':','#']
['#','@',':'] - Smart contract participation user-choice
['#','@','$',':']
['#','@',':'] - Smart contract deposit - continuous event - tokenswap
['#','@','$',':']
['@',':'] - Smart contract trigger user-choice
['#','@'] - Smart contract participation - ote-timebased
['#','@','$']
['#','@'] - Smart contract participation - continuous event - tokenswap
['#','@','$'] - Smart contract participation - continuous event - tokenswap
['@','*','$',':','#','#'] - Smart contract creation - continuous event - tokenswap
'''
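Grouping the signatures above, a first-pass classifier can be sketched as a lookup on the symbol sequence; signatures shared by several categories return all candidates and fall through to the conflict stage (the category names here are illustrative):

```python
SIGNATURE_CATEGORIES = {
    ('#',): ['token-creation', 'token-transfer'],
    ('@', '*', '#', '$', ':'): ['ote-userchoice-creation', 'ote-timebased-creation'],
    ('#', '@', ':'): ['ote-userchoice-participation', 'tokenswap-deposit'],
    ('@', ':'): ['ote-userchoice-trigger'],
    ('#', '@'): ['ote-timebased-participation', 'tokenswap-participation'],
    ('@', '*', '$', ':', '#', '#'): ['tokenswap-creation'],
}

def classify(signature):
    # Return the candidate categories for a symbol signature,
    # or 'noise' for any signature not in the table.
    return SIGNATURE_CATEGORIES.get(tuple(signature), 'noise')
```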
'''
Conflicts -
1. Token creation | Token participation
2. Smart contract CREATION of the type one-time-event-userchoice | one-time-event-timebased
3. Smart contract PARTICIPATION user-choice | Smart contract DEPOSIT continuous-event token-swap
4. Smart contract PARTICIPATION one-time-event-timebased | Smart contract PARTICIPATION continuous-event token-swap
'''
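Conflict 1 can only be resolved by keywords, since both flodata variants share the ['#'] signature; the keyword choices below are assumptions drawn from the sample flodata ("create 500 million rmt#" vs "transfer 200 rmt#"):

```python
def resolve_token_conflict(flodata):
    # Conflict 1: token creation vs token transfer, both signature ['#'].
    words = flodata.lower().split()
    if 'create' in words:
        return 'token-creation'
    if 'transfer' in words or 'send' in words:
        return 'token-transfer'
    return 'noise'
```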
'''
Emerging parser design
Phase 1 - Input processing | Special character position based classification and noise detection (FINISHED)
Phase 2 - Conflict recognition (FINISHED)
Phase 3 - Category based keyword checks
Phase 4 - Parser rules for finding data
Phase 5 - Rules for applying parser rules
Phase 6 - Category based data field extraction
Phase 7 - Output formatting and return (FINISHED)
'''
'''
Allowed formats of Smart Contract and token names
1. First character should always be an Alphabet, lower case or upper case
2. The last character should always be an Alphabet, lower case or upper case
3. The middle characters can include a - or _
Check for FLO Address
Write checks for conditions inside contract conditions
Serious error handling for contract-conditions
* 2222:00 gives error
* contractAmount = 0.022rt gives error | check if space is allowed between 0.022 rt
'''
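The three naming rules above can be sketched as a single regex; allowing digits in the middle is an assumption (the rules only mention alphabets, - and _), and note that the sample name india-elections-2019 ends in a digit, so the last-character rule may need relaxing:

```python
import re

# First and last characters must be letters; middle characters may be
# letters, digits, '-' or '_' (digits in the middle are an assumption).
NAME_RE = re.compile(r'[A-Za-z](?:[A-Za-z0-9_-]*[A-Za-z])?')

def is_valid_name(name):
    return NAME_RE.fullmatch(name) is not None
```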
'''
What we need for NFT contract code
1. NFT-address mapping table in system.db
2. New main transaction category class
3. New sub-category for transfer category class ie. NFT transfer
NFT Smart Contract end cases
1. NFT against an address
2. NFT against another NFT
3.
flodata format for NFT
Create 1000 NFT with bioscope# with nft-details: (1) name = 'bioscope' (2) hash =
Create 100 albumname# as NFT with 2CF24DBA5FB0A30E26E83B2AC5B9E29E1B161E5C1FA7425E73043362938B9824 as asset hash
[#]
Rules
-----
DIFFERENCES BETWEEN TOKEN AND NFT
System.db will have a different entry
In creation, the nft keyword will be extra
NFT hash must be present
Creation and transfer amount .. only integer parts will be taken
Keyword nft must be present in both creation and transfer
'''
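The NFT rules above imply two mechanical checks, a present nft keyword with an asset hash, and an integer-only amount; the 64-hex-character shape is an assumption taken from the SHA-256-style hash in the sample flodata:

```python
import re

HASH_RE = re.compile(r'[0-9A-Fa-f]{64}')

def nft_checks(flodata, amount):
    # The keyword 'nft' must be present, the asset hash must look like a
    # SHA-256 digest, and only the integer part of the amount is taken.
    has_keyword = 'nft' in flodata.lower().split()
    has_hash = any(HASH_RE.fullmatch(w) for w in flodata.split())
    return has_keyword and has_hash, int(amount)
```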
'''
Need infinite tokens to create stable coins, so they can be created without worrying about the upper limit of the coins
'''
'''
Create another table in system.db; it simply records what every database is, in one place
Database_name Database type
'''
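The registry described above can be sketched as a two-column table; plain sqlite3 is used here for brevity, though the to-do list elsewhere in this repo asks for all database operations to go through SQLAlchemy, and the column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(':memory:')  # stand-in for system.db
conn.execute("""CREATE TABLE IF NOT EXISTS databaseRegistry (
    database_name TEXT PRIMARY KEY,
    database_type TEXT NOT NULL
)""")
# One row per tracked database: its name and its type.
conn.execute("INSERT INTO databaseRegistry VALUES (?, ?)", ('rmt', 'token'))
conn.execute("INSERT INTO databaseRegistry VALUES (?, ?)",
             ('swap-rupee-bioscope', 'smartcontract'))
rows = dict(conn.execute("SELECT * FROM databaseRegistry"))
```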
'''
IDEA FOR NEW ROLLBACK SYSTEM - 24 Jan 2022
-------------------------------------------
245436
[
tx1 - rmt - 245436 - send 10 rmt#
tx2 - rmt - 245436 - send 4 rmt#
tx3 - rmt - 245436 - send 1 rmt#
tx4 - rmt - 245436 - send 100 rmt#
tx5 - rmt trigger(5) - 245436 - trigger
]
banana - txhash
orange - entries in activepid table
mangoes - entries in transaction history table
CURRENT SYSTEM
given a block , find out all the oranges in the block
given a block, find out all the bananas in the block and
for each banana, find corresponding databases( found through parsing of banana flodata and banana txdata)
- if token database then rollback, if contractDatabase then delete entry
NEW SYSTEM
give a block , find out all the oranges in the block
given a block, find out all the bananas in the block and their corresponding databases( found through parsing of banana flodata and banana txdata)
- start opening all those databases one by one | if token database then rollback, if contractDatabase then delete entry
send transaction -> receive the databases associated with it
'''
'''
Step 1
If the block that we are rolling back to is earlier than the database creation blockNumber, then delete the whole database without rolling back. Do this for both token databases and smart contract databases
Step 2
If the rolling back block is later than database creation blockNumber, then invoke rollback a database function( rollback_database )
Step 3
Create a list of databases to be opened, and creation date (creation date is block number). This will exclude the token and smart contract databases which are already deleted
Step 4
For each of the database to be opened, rollback the database to rollback point
rollback_database will take 2 inputs, a block number to which it has to rollback to and the name of the database
Step 5
Create a delete function, which will delete from transactionHistory, latestCache and contractDatabase
To-do
------
* Integrate all the functions in the following order:
1 , 2 , 3 , 4 , 5 | That will finish the operation of taking the block number as input and the roll back function will rollback upto the block number specified for all kinds of databases and all kinds of transactions
'''
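Steps 1 to 5 above can be sketched as a single orchestrator; the per-database 'rollback' callable and the 'creation_block' field stand in for the real rollback_database function and the creation-block lookup:

```python
def rollback_to(block_number, databases):
    # databases: list of dicts with 'name', 'creation_block', and a
    # 'rollback' callable taking the target block number.
    survivors = []
    for db in databases:
        if db['creation_block'] > block_number:
            # Step 1: database created after the rollback point, drop it whole.
            db['deleted'] = True
        else:
            # Steps 2-4: roll the database back to the target block.
            db['rollback'](block_number)
            survivors.append(db['name'])
    return survivors
```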

(new file, 29 lines)
DATABASES
* Database operations have to be optimized
- in terms of not repeating too often
- Save changes only when all business logic is approved, since we are working with multiple databases currently
* Too much repetition in database operations right now
* Database model classes, for SQLAlchemy, have to be optimized i.e. base classes for tokenswap and one-time-event are totally different right now
* Make all database operations to follow SQLAlchemy, no direct SQL commands
* Remove all position based queries
PROGRAM STRUCTURE
* Optimize overall program structure
NEW FEATURES
* Rollback feature
* When processing blocks from the websocket API, check the blockheight of the new block vs the latest block in the database | this is to make sure none of the transactions go missing
-----
processBlocks
* find the last scanned block in the database
* find the latest block at the API
* for loop for lastscannedblock to latestblock
* processEach transaction based on business logic
* Update system.db to reflect currently scanned block as the latest block
* Check for local smart contract triggers
* Check if any token swap contract deposits have to be returned
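The processBlocks outline above can be sketched as a loop; get_last_scanned, get_latest_block, process_block, and update_last_scanned are assumed stand-ins for the real system.db and API calls:

```python
def process_blocks(get_last_scanned, get_latest_block,
                   process_block, update_last_scanned):
    # Scan every block between the last scanned block recorded in the
    # database and the latest block reported by the API, inclusive.
    last_scanned = get_last_scanned()
    latest = get_latest_block()
    for height in range(last_scanned + 1, latest + 1):
        process_block(height)        # per-transaction business logic
        update_last_scanned(height)  # persist progress in system.db
    return latest
```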

View File

@ -0,0 +1,87 @@
import requests
from operator import attrgetter
import json
import pdb
'''
USD-INR
https://api.exchangerate-api.com/v4/latest/usd
Parsed stateF
"stateF":{
"bitcoin_price_source":"bitpay",
"usd_inr_exchange_source":"bitpay"
}
'''
'''
stateF notes for amount split on contracts
stateF_object = {
"floaddresses": "oPkHWcvqBHfCortTHScrVBjXLsZhWie99C-oPkHWcvqBHfCortTHScrVBjXLsZhWie99C-oPkHWcvqBHfCortTHScrVBjXLsZhWie99C",
"splits": "10-20-30",
}
'''
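The amount-split stateF object above implies a simple parser pairing each FLO address with its split; the float conversion and the equal-length check are assumptions:

```python
def parse_split_stateF(statef):
    # Both fields are '-'-delimited strings of equal length; pair each
    # FLO address with its split value (base58 addresses contain no '-').
    addresses = statef['floaddresses'].split('-')
    splits = [float(s) for s in statef['splits'].split('-')]
    if len(addresses) != len(splits):
        raise ValueError('address/split count mismatch')
    return dict(zip(addresses, splits))
```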
# stateF
stateF_address = 'oPkHWcvqBHfCortTHScrVBjXLsZhWie99C'
stateF_object = {
"bitcoin_price_source":"bitpay",
"usd_inr_exchange_source":"bitpay"
}
# Flodata object
flodata_object = {
"bitpay": {
"bitcoin_price_source":{
"api" : "https://bitpay.com/api/rates",
"path" : [2,"rate"],
"data_type" : "float"
},
"usd_inr_exchange_source":{
"api" : "https://api.exchangerate-api.com/v4/latest/usd",
"path" : ["rates","INR"],
"data_type" : "float"
}
}
}
def pull_stateF(floID):
response = requests.get(f"https://flosight-testnet.ranchimall.net/api/txs/?address={floID}")
if response.status_code == 200:
address_details = response.json()
latest_stateF = address_details['txs'][0]['floData']
latest_stateF = json.loads(latest_stateF)
return latest_stateF['stateF']
else:
print('API response not valid')
def query_api(api_object):
api, path, data_type = api_object.values()
response = requests.get(api)
if response.status_code == 200:
# Use path keys to reach the value
api_response = response.json()
for key in path:
api_response = api_response[key]
# todo: how to use datatype to convert
if data_type == 'float':
value_at_path = float(api_response)
return value_at_path
else:
print('API response not valid')
def process_stateF(stateF_object, stateF_address):
flodata_object = pull_stateF(stateF_address)
processed_values = {}
for key, value in stateF_object.items():
external_value = query_api(flodata_object[value][key])
processed_values[key] = external_value
return processed_values
if __name__ == '__main__':
processed_statef = process_stateF(stateF_object, stateF_address)
print(processed_statef)

(new file, 87 lines)
import argparse
import configparser
import json
import logging
import os
import shutil
import sys
import pyflo
import requests
import socketio
from sqlalchemy import create_engine, func
from sqlalchemy.orm import sessionmaker
import time
import arrow
import parsing
from datetime import datetime
from ast import literal_eval
from models import SystemData, TokenBase, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, TokenContractAssociation, RejectedTransactionHistory, ContractBase, ContractStructure, ContractParticipants, ContractTransactionHistory, ContractDeposits, ConsumedInfo, ContractWinners, ContinuosContractBase, ContractStructure2, ContractParticipants2, ContractDeposits2, ContractTransactionHistory2, SystemBase, ActiveContracts, SystemData, ContractAddressMapping, TokenAddressMapping, DatabaseTypeMapping, TimeActions, RejectedContractTransactionHistory, RejectedTransactionHistory, LatestCacheBase, LatestTransactions, LatestBlocks
from statef_processing import process_stateF
# Configuration of required variables
config = configparser.ConfigParser()
config.read('config.ini')
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
file_handler = logging.FileHandler(os.path.join(config['DEFAULT']['DATA_PATH'],'tracking.log'))
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(formatter)
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(formatter)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)
def create_database_connection(type, parameters):
if type == 'token':
path = os.path.join(config['DEFAULT']['DATA_PATH'], 'tokens', f"{parameters['token_name']}.db")
engine = create_engine(f"sqlite:///{path}", echo=True)
elif type == 'smart_contract':
path = os.path.join(config['DEFAULT']['DATA_PATH'], 'smartContracts', f"{parameters['contract_name']}-{parameters['contract_address']}.db")
engine = create_engine(f"sqlite:///{path}", echo=True)
elif type == 'system_dbs':
path = os.path.join(config['DEFAULT']['DATA_PATH'], f"system.db")
engine = create_engine(f"sqlite:///{path}", echo=False)
elif type == 'latest_cache':
path = os.path.join(config['DEFAULT']['DATA_PATH'], f"latestCache.db")
engine = create_engine(f"sqlite:///{path}", echo=False)
connection = engine.connect()
return connection
def create_database_session_orm(type, parameters, base):
if type == 'token':
path = os.path.join(config['DEFAULT']['DATA_PATH'], 'tokens', f"{parameters['token_name']}.db")
engine = create_engine(f"sqlite:///{path}", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
path = os.path.join(config['DEFAULT']['DATA_PATH'], 'smartContracts', f"{parameters['contract_name']}-{parameters['contract_address']}.db")
engine = create_engine(f"sqlite:///{path}", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
path = os.path.join(config['DEFAULT']['DATA_PATH'], f"{parameters['db_name']}.db")
engine = create_engine(f"sqlite:///{path}", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
return session
# Connect to system.db with a session
'''session = create_database_session_orm('system_dbs', {'db_name':'system1'}, SystemBase)
subquery_filter = session.query(TimeActions.id).group_by(TimeActions.transactionHash).having(func.count(TimeActions.transactionHash)==1).subquery()
contract_deposits = session.query(TimeActions).filter(TimeActions.id.in_(subquery_filter), TimeActions.status=='active', TimeActions.activity=='contract-deposit').all()
for contract in contract_deposits:
print(contract.transactionHash)'''
systemdb_session = create_database_session_orm('system_dbs', {'db_name':'system'}, SystemBase)
query = systemdb_session.query(TokenAddressMapping).filter(TokenAddressMapping.tokenAddress == 'contractAddress')
results = query.all()
print('Lets investigate this now')

src/backend/util_rebuild.py (new file, 238 lines)
from sqlalchemy import create_engine, desc, func
from sqlalchemy.orm import sessionmaker
from models import SystemData, TokenBase, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, TokenContractAssociation, ContractBase, ContractStructure, ContractParticipants, ContractTransactionHistory, ContractDeposits, ConsumedInfo, ContractWinners, ContinuosContractBase, ContractStructure2, ContractParticipants2, ContractDeposits2, ContractTransactionHistory2, SystemBase, ActiveContracts, SystemData, ContractAddressMapping, TokenAddressMapping, DatabaseTypeMapping, TimeActions, RejectedContractTransactionHistory, RejectedTransactionHistory, LatestCacheBase, LatestTransactions, LatestBlocks
import json
from backend_main import processTransaction, checkLocal_expiry_trigger_deposit, newMultiRequest
import os
import logging
import argparse
import configparser
import shutil
import sys
import pdb
# helper functions
def check_database_existence(type, parameters):
if type == 'token':
return os.path.isfile(f"./tokens/{parameters['token_name']}.db")
if type == 'smart_contract':
return os.path.isfile(f"./smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db")
def create_database_connection(type, parameters):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
connection = engine.connect()
return connection
def create_database_session_orm(type, parameters, base):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
return session
# MAIN EXECUTION STARTS
# Configuration of required variables
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
file_handler = logging.FileHandler('tracking.log')
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(formatter)
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(formatter)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)
# Rule 1 - Read command line arguments to reset the databases as blank
# Rule 2 - Read config to set testnet/mainnet
# Rule 3 - Set flo blockexplorer location depending on testnet or mainnet
# Rule 4 - Set the local flo-cli path depending on testnet or mainnet ( removed this feature | Flosights are the only source )
# Rule 5 - Set the block number to scan from
# Read command line arguments
parser = argparse.ArgumentParser(description='Script tracks RMT using FLO data on the FLO blockchain - https://flo.cash')
parser.add_argument('-rb', '--toblocknumber', nargs='?', type=int, help='Forward to the specified block number')
parser.add_argument('-r', '--blockcount', nargs='?', type=int, help='Forward to the specified block count')
args = parser.parse_args()
if (args.blockcount and args.toblocknumber):
print("You can only specify one of the options -rb or -r")
sys.exit(0)
elif args.blockcount:
# last scanned block comes from latestCache, mirroring the default branch below
latestCache_session = create_database_session_orm('system_dbs', {'db_name':'latestCache'}, LatestCacheBase)
lastscannedblock = int(latestCache_session.query(LatestBlocks.blockNumber).order_by(LatestBlocks.blockNumber.desc()).first()[0])
latestCache_session.close()
forward_block = lastscannedblock + args.blockcount
elif args.toblocknumber:
forward_block = args.toblocknumber
else:
latestCache_session = create_database_session_orm('system_dbs', {'db_name':'latestCache'}, LatestCacheBase)
forward_block = int(latestCache_session.query(LatestBlocks.blockNumber).order_by(LatestBlocks.blockNumber.desc()).first()[0])
latestCache_session.close()
apppath = os.path.dirname(os.path.realpath(__file__))
dirpath = os.path.join(apppath, 'tokens')
if not os.path.isdir(dirpath):
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'smartContracts')
if not os.path.isdir(dirpath):
os.mkdir(dirpath)
# rename all the old databases
# system.db , latestCache.db, smartContracts, tokens
if os.path.isfile('./system.db'):
os.rename('system.db', 'system1.db')
if os.path.isfile('./latestCache.db'):
os.rename('latestCache.db', 'latestCache1.db')
if os.path.isdir('./smartContracts'):
os.rename('smartContracts', 'smartContracts1')
if os.path.isdir('./tokens'):
os.rename('tokens', 'tokens1')
# Read configuration
config = configparser.ConfigParser()
config.read('config.ini')
# todo - write all assertions to make sure default configs are right
if (config['DEFAULT']['NET'] != 'mainnet') and (config['DEFAULT']['NET'] != 'testnet'):
logger.error("NET parameter in config.ini invalid. Options are either 'mainnet' or 'testnet'. Script is exiting now")
sys.exit(0)
# Specify mainnet and testnet server list for API calls and websocket calls
serverlist = None
if config['DEFAULT']['NET'] == 'mainnet':
serverlist = config['DEFAULT']['MAINNET_BLOCKBOOK_SERVER_LIST']
elif config['DEFAULT']['NET'] == 'testnet':
serverlist = config['DEFAULT']['TESTNET_BLOCKBOOK_SERVER_LIST']
serverlist = serverlist.split(',')
neturl = config['DEFAULT']['BLOCKBOOK_NETURL']
tokenapi_sse_url = config['DEFAULT']['TOKENAPI_SSE_URL']
# Delete database and smartcontract directory if reset is set to 1
#if args.reset == 1:
logger.info("Resetting the database. ")
apppath = os.path.dirname(os.path.realpath(__file__))
dirpath = os.path.join(apppath, 'tokens')
shutil.rmtree(dirpath)
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'smartContracts')
shutil.rmtree(dirpath)
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'system.db')
if os.path.exists(dirpath):
os.remove(dirpath)
dirpath = os.path.join(apppath, 'latestCache.db')
if os.path.exists(dirpath):
os.remove(dirpath)
# Read start block no
startblock = int(config['DEFAULT']['START_BLOCK'])
session = create_database_session_orm('system_dbs', {'db_name': "system"}, SystemBase)
session.add(SystemData(attribute='lastblockscanned', value=startblock - 1))
session.commit()
session.close()
# Initialize latest cache DB
session = create_database_session_orm('system_dbs', {'db_name': "latestCache"}, LatestCacheBase)
session.commit()
session.close()
# get all blocks and transaction data
latestCache_session = create_database_session_orm('system_dbs', {'db_name':'latestCache1'}, LatestCacheBase)
if forward_block:
lblocks = latestCache_session.query(LatestBlocks).filter(LatestBlocks.blockNumber <= forward_block).all()
ltransactions = latestCache_session.query(LatestTransactions).filter(LatestTransactions.blockNumber <= forward_block).all()
else:
lblocks = latestCache_session.query(LatestBlocks).all()
ltransactions = latestCache_session.query(LatestTransactions).all()
latestCache_session.close()
# make a list of all internal tx block numbers
systemDb_session = create_database_session_orm('system_dbs', {'db_name':'system1'}, SystemBase)
internal_action_blocks = systemDb_session.query(ActiveContracts.blockNumber).all()
internal_action_blocks = [block[0] for block in internal_action_blocks]
internal_action_blocks = sorted(internal_action_blocks)
lblocks_dict = {}
for block in lblocks:
block_dict = block.__dict__
print(block_dict['blockNumber'])
lblocks_dict[block_dict['blockNumber']] = {'blockHash':f"{block_dict['blockHash']}", 'jsonData':f"{block_dict['jsonData']}"}
# process and rebuild all transactions
prev_block = 0
for transaction in ltransactions:
transaction_dict = transaction.__dict__
current_block = transaction_dict['blockNumber']
# Check if any internal action block lies between prev_block and current_block
for internal_block in internal_action_blocks:
if prev_block < internal_block <= current_block:
logger.info(f'Processing block {internal_block}')
# Get block details
response = newMultiRequest(f"block-index/{internal_block}")
blockhash = response['blockHash']
blockinfo = newMultiRequest(f"block/{blockhash}")
# Call your function here, passing the internal block to it
checkLocal_expiry_trigger_deposit(blockinfo)
transaction_data = json.loads(transaction_dict['jsonData'])
parsed_flodata = json.loads(transaction_dict['parsedFloData'])
try:
block_info = json.loads(lblocks_dict[transaction_dict['blockNumber']]['jsonData'])
processTransaction(transaction_data, parsed_flodata, block_info)
prev_block = current_block
except Exception:
# block data missing from latestCache; record progress and skip this transaction
prev_block = current_block
continue
# copy the old block data
old_latest_cache = create_database_connection('system_dbs', {'db_name':'latestCache1'})
old_latest_cache.execute("ATTACH DATABASE 'latestCache.db' AS new_db")
old_latest_cache.execute("INSERT INTO new_db.latestBlocks SELECT * FROM latestBlocks WHERE blockNumber <= ?", (forward_block,))
old_latest_cache.close()
# delete
# system.db , latestCache.db, smartContracts, tokens
if os.path.isfile('./system1.db'):
os.remove('system1.db')
if os.path.isfile('./latestCache1.db'):
os.remove('latestCache1.db')
if os.path.isdir('./smartContracts1'):
shutil.rmtree('smartContracts1')
if os.path.isdir('./tokens1'):
shutil.rmtree('tokens1')
# Update system.db's last scanned block
connection = create_database_connection('system_dbs', {'db_name': "system"})
connection.execute(f"UPDATE systemData SET value = {int(list(lblocks_dict.keys())[-1])} WHERE attribute = 'lastblockscanned';")
connection.close()

(new file, 247 lines)
from sqlalchemy import create_engine, desc, func
from sqlalchemy.orm import sessionmaker
from models import SystemData, TokenBase, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, TokenContractAssociation, ContractBase, ContractStructure, ContractParticipants, ContractTransactionHistory, ContractDeposits, ConsumedInfo, ContractWinners, ContinuosContractBase, ContractStructure2, ContractParticipants2, ContractDeposits2, ContractTransactionHistory2, SystemBase, ActiveContracts, SystemData, ContractAddressMapping, TokenAddressMapping, DatabaseTypeMapping, TimeActions, RejectedContractTransactionHistory, RejectedTransactionHistory, LatestCacheBase, LatestTransactions, LatestBlocks
import json
from backend_main import processTransaction, checkLocal_expiry_trigger_deposit, newMultiRequest
import os
import logging
import argparse
import configparser
import shutil
import sys
import pdb
# helper functions
def check_database_existence(type, parameters):
if type == 'token':
return os.path.isfile(f"./tokens/{parameters['token_name']}.db")
if type == 'smart_contract':
return os.path.isfile(f"./smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db")
def create_database_connection(type, parameters):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
connection = engine.connect()
return connection
def create_database_session_orm(type, parameters, base):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
return session
# MAIN EXECUTION STARTS
# Configuration of required variables
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
file_handler = logging.FileHandler('tracking.log')
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(formatter)
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(formatter)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)
# Rule 1 - Read command line arguments to reset the databases as blank
# Rule 2 - Read config to set testnet/mainnet
# Rule 3 - Set flo blockexplorer location depending on testnet or mainnet
# Rule 4 - Set the local flo-cli path depending on testnet or mainnet ( removed this feature | Blockbook is the only source )
# Rule 5 - Set the block number to scan from
# Read command line arguments
parser = argparse.ArgumentParser(description='Script tracks RMT using FLO data on the FLO blockchain - https://flo.cash')
parser.add_argument('-rb', '--toblocknumber', nargs='?', type=int, help='Forward to the specified block number')
parser.add_argument('-r', '--blockcount', nargs='?', type=int, help='Forward by the specified number of blocks')
parser.add_argument('-to', '--to_blockNumber', nargs='?', type=int, help='Process until the specified block number') # New argument
args = parser.parse_args()
if (args.blockcount and args.toblocknumber):
print("You can only specify one of the options -rb or -r")
sys.exit(0)
elif args.blockcount:
# derive the starting point from the last scanned block recorded in system.db
system_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
lastscannedblock = int(system_session.query(SystemData.value).filter(SystemData.attribute == 'lastblockscanned').first()[0])
system_session.close()
forward_block = lastscannedblock + args.blockcount
elif args.toblocknumber:
forward_block = args.toblocknumber
else:
latestCache_session = create_database_session_orm('system_dbs', {'db_name':'latestCache'}, LatestCacheBase)
forward_block = int(latestCache_session.query(LatestBlocks.blockNumber).order_by(LatestBlocks.blockNumber.desc()).first()[0])
latestCache_session.close()
apppath = os.path.dirname(os.path.realpath(__file__))
dirpath = os.path.join(apppath, 'tokens')
if not os.path.isdir(dirpath):
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'smartContracts')
if not os.path.isdir(dirpath):
os.mkdir(dirpath)
# rename all the old databases
# system.db , latestCache.db, smartContracts, tokens
if os.path.isfile('./system.db'):
os.rename('system.db', 'system1.db')
if os.path.isfile('./latestCache.db'):
os.rename('latestCache.db', 'latestCache1.db')
if os.path.isdir('./smartContracts'):
os.rename('smartContracts', 'smartContracts1')
if os.path.isdir('./tokens'):
os.rename('tokens', 'tokens1')
# Read configuration
config = configparser.ConfigParser()
config.read('config.ini')
# todo - write all assertions to make sure default configs are right
if (config['DEFAULT']['NET'] != 'mainnet') and (config['DEFAULT']['NET'] != 'testnet'):
logger.error("NET parameter in config.ini invalid. Options are either 'mainnet' or 'testnet'. Script is exiting now")
sys.exit(0)
# Specify mainnet and testnet server list for API calls and websocket calls
serverlist = None
if config['DEFAULT']['NET'] == 'mainnet':
serverlist = config['DEFAULT']['MAINNET_BLOCKBOOK_SERVER_LIST']
elif config['DEFAULT']['NET'] == 'testnet':
serverlist = config['DEFAULT']['TESTNET_BLOCKBOOK_SERVER_LIST']
serverlist = serverlist.split(',')
neturl = config['DEFAULT']['BLOCKBOOK_NETURL']
tokenapi_sse_url = config['DEFAULT']['TOKENAPI_SSE_URL']
# Reset the databases and smart contract directories
# (the old `if args.reset == 1` gate has been removed; this rebuild script always resets)
logger.info("Resetting the database.")
apppath = os.path.dirname(os.path.realpath(__file__))
dirpath = os.path.join(apppath, 'tokens')
if os.path.isdir(dirpath):
shutil.rmtree(dirpath)
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'smartContracts')
if os.path.isdir(dirpath):
shutil.rmtree(dirpath)
os.mkdir(dirpath)
dirpath = os.path.join(apppath, 'system.db')
if os.path.exists(dirpath):
os.remove(dirpath)
dirpath = os.path.join(apppath, 'latestCache.db')
if os.path.exists(dirpath):
os.remove(dirpath)
# Read start block no
startblock = int(config['DEFAULT']['START_BLOCK'])
session = create_database_session_orm('system_dbs', {'db_name': "system"}, SystemBase)
session.add(SystemData(attribute='lastblockscanned', value=startblock - 1))
session.commit()
session.close()
# Initialize latest cache DB
session = create_database_session_orm('system_dbs', {'db_name': "latestCache"}, LatestCacheBase)
session.commit()
session.close()
# get all blocks and transaction data
latestCache_session = create_database_session_orm('system_dbs', {'db_name':'latestCache1'}, LatestCacheBase)
if forward_block:
lblocks = latestCache_session.query(LatestBlocks).filter(LatestBlocks.blockNumber <= forward_block).all()
ltransactions = latestCache_session.query(LatestTransactions).filter(LatestTransactions.blockNumber <= forward_block).all()
else:
lblocks = latestCache_session.query(LatestBlocks).all()
ltransactions = latestCache_session.query(LatestTransactions).all()
latestCache_session.close()
# make a list of all internal tx block numbers
systemDb_session = create_database_session_orm('system_dbs', {'db_name':'system1'}, SystemBase)
internal_action_blocks = systemDb_session.query(ActiveContracts.blockNumber).all()
internal_action_blocks = [block[0] for block in internal_action_blocks]
internal_action_blocks = sorted(internal_action_blocks)
lblocks_dict = {}
for block in lblocks:
block_dict = block.__dict__
print(block_dict['blockNumber'])
lblocks_dict[block_dict['blockNumber']] = {'blockHash':f"{block_dict['blockHash']}", 'jsonData':f"{block_dict['jsonData']}"}
# process and rebuild all transactions
prev_block = 0
for transaction in ltransactions:
transaction_dict = transaction.__dict__
current_block = transaction_dict['blockNumber']
# Check if any internal action block lies between prev_block and current_block
for internal_block in internal_action_blocks:
if prev_block < internal_block <= current_block:
logger.info(f'Processing block {internal_block}')
# Get block details
response = newMultiRequest(f"block-index/{internal_block}")
blockhash = response['blockHash']
blockinfo = newMultiRequest(f"block/{blockhash}")
# Call your function here, passing the internal block to it
checkLocal_expiry_trigger_deposit(blockinfo)
# fetch fresh transaction data from the API; the cached jsonData copy is superseded here
transaction_data = newMultiRequest(f"tx/{transaction_dict['transactionHash']}")
parsed_flodata = json.loads(transaction_dict['parsedFloData'])
try:
block_info = json.loads(lblocks_dict[transaction_dict['blockNumber']]['jsonData'])
processTransaction(transaction_data, parsed_flodata, block_info)
prev_block = current_block
except Exception:
logger.exception(f"Failed to process transaction {transaction_dict['transactionHash']}")
prev_block = current_block
continue
# Check if the current block exceeds the specified "to_blockNumber"
if args.to_blockNumber is not None and current_block >= args.to_blockNumber:
logger.info(f"Reached the specified block number {args.to_blockNumber}. Stopping processing.")
break
# copy the old block data
old_latest_cache = create_database_connection('system_dbs', {'db_name':'latestCache1'})
old_latest_cache.execute("ATTACH DATABASE 'latestCache.db' AS new_db")
old_latest_cache.execute("INSERT INTO new_db.latestBlocks SELECT * FROM latestBlocks WHERE blockNumber <= ?", (forward_block,))
old_latest_cache.close()
# delete
# system.db , latestCache.db, smartContracts, tokens
if os.path.isfile('./system1.db'):
os.remove('system1.db')
if os.path.isfile('./latestCache1.db'):
os.remove('latestCache1.db')
if os.path.isdir('./smartContracts1'):
shutil.rmtree('smartContracts1')
if os.path.isdir('./tokens1'):
shutil.rmtree('tokens1')
# Update system.db's last scanned block
connection = create_database_connection('system_dbs', {'db_name': "system"})
connection.execute(f"UPDATE systemData SET value = {int(list(lblocks_dict.keys())[-1])} WHERE attribute = 'lastblockscanned';")
connection.close()
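The `ATTACH DATABASE` copy used above can be exercised in isolation with the stdlib `sqlite3` module. This sketch uses temporary files and a minimal stand-in `latestBlocks` schema (both assumptions for illustration), copying only rows up to a cutoff block from the old cache into a fresh one:

```python
import os
import sqlite3
import tempfile

tmpdir = tempfile.mkdtemp()
old_db = os.path.join(tmpdir, 'latestCache1.db')
new_db = os.path.join(tmpdir, 'latestCache.db')

# old cache with three blocks (minimal stand-in schema)
con = sqlite3.connect(old_db)
con.execute("CREATE TABLE latestBlocks (blockNumber INTEGER, blockHash TEXT)")
con.executemany("INSERT INTO latestBlocks VALUES (?, ?)", [(101, 'a'), (102, 'b'), (103, 'c')])
con.commit()

# fresh cache to copy into
new_con = sqlite3.connect(new_db)
new_con.execute("CREATE TABLE latestBlocks (blockNumber INTEGER, blockHash TEXT)")
new_con.commit()
new_con.close()

# attach the new DB and copy everything up to the forward block
forward_block = 102
con.execute(f"ATTACH DATABASE '{new_db}' AS new_db")
con.execute("INSERT INTO new_db.latestBlocks SELECT * FROM latestBlocks WHERE blockNumber <= ?", (forward_block,))
con.commit()
con.close()

copied = sqlite3.connect(new_db).execute("SELECT COUNT(*) FROM latestBlocks").fetchone()[0]
print(copied)  # 2
```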

@ -0,0 +1,111 @@
from sqlalchemy import create_engine, desc, func
from sqlalchemy.orm import sessionmaker
from models import SystemData, TokenBase, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, TokenContractAssociation, ContractBase, ContractStructure, ContractParticipants, ContractTransactionHistory, ContractDeposits, ConsumedInfo, ContractWinners, ContinuosContractBase, ContractStructure2, ContractParticipants2, ContractDeposits2, ContractTransactionHistory2, SystemBase, ActiveContracts, SystemData, ContractAddressMapping, TokenAddressMapping, DatabaseTypeMapping, TimeActions, RejectedContractTransactionHistory, RejectedTransactionHistory, LatestCacheBase, LatestTransactions, LatestBlocks
import json
from backend_main import processTransaction, checkLocal_expiry_trigger_deposit, newMultiRequest
import os
import logging
import argparse
import configparser
import shutil
import sys
import pdb
# helper functions
def check_database_existence(type, parameters):
if type == 'token':
return os.path.isfile(f"./tokens/{parameters['token_name']}.db")
if type == 'smart_contract':
return os.path.isfile(f"./smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db")
def create_database_connection(type, parameters):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
connection = engine.connect()
return connection
def create_database_session_orm(type, parameters, base):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
return session
# MAIN EXECUTION STARTS
# Configuration of required variables
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s:%(name)s:%(message)s')
file_handler = logging.FileHandler('tracking.log')
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(formatter)
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(formatter)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)
# Rule 1 - Read command line arguments to reset the databases as blank
# Rule 2 - Read config to set testnet/mainnet
# Rule 3 - Set flo blockexplorer location depending on testnet or mainnet
# Rule 4 - Set the local flo-cli path depending on testnet or mainnet ( removed this feature | Blockbook is the only source )
# Rule 5 - Set the block number to scan from
# Read command line arguments
parser = argparse.ArgumentParser(description='Script tracks RMT using FLO data on the FLO blockchain - https://flo.cash')
parser.add_argument('-rb', '--resetblocknumer', nargs='?', type=int, help='Reset lastblockscanned to the specified block number')
args = parser.parse_args()
# Read configuration
config = configparser.ConfigParser()
config.read('config.ini')
# todo - write all assertions to make sure default configs are right
if (config['DEFAULT']['NET'] != 'mainnet') and (config['DEFAULT']['NET'] != 'testnet'):
logger.error("NET parameter in config.ini invalid. Options are either 'mainnet' or 'testnet'. Script is exiting now")
sys.exit(0)
# Specify mainnet and testnet server list for API calls and websocket calls
serverlist = None
if config['DEFAULT']['NET'] == 'mainnet':
serverlist = config['DEFAULT']['MAINNET_BLOCKBOOK_SERVER_LIST']
elif config['DEFAULT']['NET'] == 'testnet':
serverlist = config['DEFAULT']['TESTNET_BLOCKBOOK_SERVER_LIST']
serverlist = serverlist.split(',')
neturl = config['DEFAULT']['BLOCKBOOK_NETURL']
tokenapi_sse_url = config['DEFAULT']['TOKENAPI_SSE_URL']
# Update system.db's last scanned block
connection = create_database_connection('system_dbs', {'db_name': "system"})
if args.resetblocknumer is None:
logger.error("No block number specified. Use -rb to pass the block number to reset lastblockscanned to")
sys.exit(0)
connection.execute(f"UPDATE systemData SET value = {int(args.resetblocknumer)} WHERE attribute = 'lastblockscanned';")
connection.close()

@ -0,0 +1,483 @@
import argparse
from sqlalchemy import create_engine, func
from sqlalchemy.orm import sessionmaker
from src.backend.models import SystemData, TokenBase, ActiveTable, ConsumedTable, TransferLogs, TransactionHistory, TokenContractAssociation, RejectedTransactionHistory, ContractBase, ContractStructure, ContractParticipants, ContractTransactionHistory, ContractDeposits, ConsumedInfo, ContractWinners, ContinuosContractBase, ContractStructure2, ContractParticipants2, ContractDeposits2, ContractTransactionHistory2, SystemBase, ActiveContracts, SystemData, ContractAddressMapping, TokenAddressMapping, DatabaseTypeMapping, TimeActions, RejectedContractTransactionHistory, RejectedTransactionHistory, LatestCacheBase, LatestTransactions, LatestBlocks
from ast import literal_eval
import os
import json
import logging
import sys
from src.backend.parsing import perform_decimal_operation
apppath = os.path.dirname(os.path.realpath(__file__))
# helper functions
def check_database_existence(type, parameters):
if type == 'token':
return os.path.isfile(f"./tokens/{parameters['token_name']}.db")
if type == 'smart_contract':
return os.path.isfile(f"./smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db")
def create_database_connection(type, parameters):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
connection = engine.connect()
return connection
def create_database_session_orm(type, parameters, base):
if type == 'token':
engine = create_engine(f"sqlite:///tokens/{parameters['token_name']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'smart_contract':
engine = create_engine(f"sqlite:///smartContracts/{parameters['contract_name']}-{parameters['contract_address']}.db", echo=True)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
elif type == 'system_dbs':
engine = create_engine(f"sqlite:///{parameters['db_name']}.db", echo=False)
base.metadata.create_all(bind=engine)
session = sessionmaker(bind=engine)()
else:
raise ValueError(f"Unknown database type: {type}")
return session
def inspect_parsed_flodata(parsed_flodata, inputAddress, outputAddress):
if parsed_flodata['type'] == 'transfer':
if parsed_flodata['transferType'] == 'token':
return {'type':'tokentransfer', 'token_db':f"{parsed_flodata['tokenIdentification']}", 'token_amount':f"{parsed_flodata['tokenAmount']}"}
if parsed_flodata['transferType'] == 'smartContract':
return {'type':'smartContract', 'contract_db': f"{parsed_flodata['contractName']}-{outputAddress}" ,'accepting_token_db':f"{parsed_flodata['']}", 'receiving_token_db':f"{parsed_flodata['tokenIdentification']}" ,'token_amount':f"{parsed_flodata['tokenAmount']}"}
if parsed_flodata['transferType'] == 'swapParticipation':
return {'type':'swapParticipation', 'contract_db': f"{parsed_flodata['contractName']}-{outputAddress}" ,'accepting_token_db':f"{parsed_flodata['contract-conditions']['accepting_token']}", 'receiving_token_db':f"{parsed_flodata['tokenIdentification']}" ,'token_amount':f"{parsed_flodata['tokenAmount']}"}
if parsed_flodata['transferType'] == 'nft':
return {'type':'nfttransfer', 'nft_db':f"{parsed_flodata['tokenIdentification']}", 'token_amount':f"{parsed_flodata['tokenAmount']}"}
if parsed_flodata['type'] == 'tokenIncorporation':
return {'type':'tokenIncorporation', 'token_db':f"{parsed_flodata['tokenIdentification']}", 'token_amount':f"{parsed_flodata['tokenAmount']}"}
if parsed_flodata['type'] == 'smartContractPays':
# contract address, token | both of them come from
sc_session = create_database_session_orm('smart_contract', {'contract_name':f"{parsed_flodata['contractName']}", 'contract_address':f"{outputAddress}"}, ContractBase)
token_db = sc_session.query(ContractStructure.value).filter(ContractStructure.attribute=='tokenIdentification').first()[0]
return {'type':'smartContractPays', 'token_db':f"{token_db}" , 'contract_db':f"{parsed_flodata['contractName']}-{outputAddress}", 'triggerCondition':f"{parsed_flodata['triggerCondition']}"}
if parsed_flodata['type'] == 'smartContractIncorporation':
return {'type':'smartContractIncorporation', 'contract_db':f"{parsed_flodata['contractName']}-{outputAddress}", 'triggerCondition':f"{parsed_flodata['triggerCondition']}"}
def getDatabase_from_parsedFloData(parsed_flodata, inputAddress, outputAddress):
tokenlist = []
contractlist = []
if parsed_flodata['type'] == 'transfer':
if parsed_flodata['transferType'] == 'token':
#return {'type':'token_db', 'token_db':f"{parsed_flodata['tokenIdentification']}"}
tokenlist.append(parsed_flodata['tokenIdentification'])
elif parsed_flodata['transferType'] == 'smartContract':
#return {'type':'smartcontract_db', 'contract_db': f"{parsed_flodata['contractName']}-{outputAddress}" ,'token_db':f"{parsed_flodata['tokenIdentification']}"}
tokenlist.append(parsed_flodata['tokenIdentification'])
contractlist.append(f"{parsed_flodata['contractName']}-{outputAddress}")
elif parsed_flodata['transferType'] == 'swapParticipation':
#return {'type':'swapcontract_db', 'contract_db': f"{parsed_flodata['contractName']}-{outputAddress}" ,'accepting_token_db':f"{parsed_flodata['contract-conditions']['accepting_token']}", 'selling_token_db':f"{parsed_flodata['contract-conditions']['selling_token']}"}
tokenlist.append(parsed_flodata['contract-conditions']['accepting_token'])
tokenlist.append(parsed_flodata['contract-conditions']['selling_token'])
contractlist.append(f"{parsed_flodata['contractName']}-{outputAddress}")
elif parsed_flodata['transferType'] == 'nft':
#return {'type':'nft_db', 'token_db':f"{parsed_flodata['tokenIdentification']}"}
tokenlist.append(parsed_flodata['tokenIdentification'])
elif parsed_flodata['type'] == 'smartContractPays':
# contract address, token | both of them come from
sc_session = create_database_session_orm('smart_contract', {'contract_name':f"{parsed_flodata['contractName']}", 'contract_address':f"{outputAddress}"}, ContractBase)
token_db = sc_session.query(ContractStructure.value).filter(ContractStructure.attribute=='tokenIdentification').first()[0]
#return {'type':'smartcontract_db', 'contract_db':f"{parsed_flodata['contractName']}-{outputAddress}", 'token_db':f"{token_db}"}
tokenlist.append(token_db)
contractlist.append(f"{parsed_flodata['contractName']}-{outputAddress}")
elif parsed_flodata['type'] == 'smartContractIncorporation':
#return {'type':'smartcontract_db', 'contract_db':f"{parsed_flodata['contractName']}-{outputAddress}"}
contractlist.append(f"{parsed_flodata['contractName']}-{outputAddress}")
elif parsed_flodata['type'] == 'tokenIncorporation':
#return {'type':'token_db', 'token_db':f"{parsed_flodata['tokenIdentification']}"}
tokenlist.append(parsed_flodata['tokenIdentification'])
return tokenlist, contractlist
def calc_pid_amount(transferBalance, consumedpid):
consumedpid_sum = 0
for key in list(consumedpid.keys()):
consumedpid_sum = perform_decimal_operation('addition', consumedpid_sum, float(consumedpid[key]))
return perform_decimal_operation('subtraction', transferBalance, consumedpid_sum)
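`calc_pid_amount` computes how much of a transfer must have been drawn from the parent pid once the fully consumed pids are subtracted. A float-only sketch of the same arithmetic (the real function routes every operation through `perform_decimal_operation` to avoid float drift; `calc_pid_amount_sketch` is illustrative only):

```python
def calc_pid_amount_sketch(transferBalance, consumedpid):
    # total amount supplied by pids that were consumed in full
    consumed_sum = sum(float(amount) for amount in consumedpid.values())
    # remainder attributed to the parent pid
    return transferBalance - consumed_sum

print(calc_pid_amount_sketch(10.0, {'4': 2.5, '7': 1.5}))  # 6.0
```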
def find_addressBalance_from_floAddress(database_session, floAddress):
query_output = database_session.query(ActiveTable).filter(ActiveTable.address==floAddress, ActiveTable.addressBalance!=None).first()
if query_output is None:
return 0
else:
return query_output.addressBalance
def rollback_address_balance_processing(db_session, senderAddress, receiverAddress, transferBalance):
# Find out total sum of address
# Find out the last entry where address balance is not null, if exists make it null
# Calculation phase
current_receiverBalance = find_addressBalance_from_floAddress(db_session, receiverAddress)
current_senderBalance = find_addressBalance_from_floAddress(db_session ,senderAddress)
new_receiverBalance = perform_decimal_operation('subtraction', current_receiverBalance, transferBalance)
new_senderBalance = perform_decimal_operation('addition', current_senderBalance, transferBalance)
# Insertion phase
# if new receiver balance is 0, then only insert sender address balance
# if receiver balance is not 0, then update previous occurence of the receiver address and sender balance
# for sender: either the query returns nothing or the last occurrence carries the address,
# so in all cases update the addressBalance of the last occurrence of the sender floAddress
# for receiver: if the current address balance is 0 do nothing; otherwise update the last occurrence of the receiver address
sender_query = db_session.query(ActiveTable).filter(ActiveTable.address==senderAddress).order_by(ActiveTable.id.desc()).first()
sender_query.addressBalance = new_senderBalance
if new_receiverBalance != 0 and new_receiverBalance > 0:
receiver_query = db_session.query(ActiveTable).filter(ActiveTable.address==receiverAddress).order_by(ActiveTable.id.desc()).limit(2).all()
if len(receiver_query) == 2:
receiver_query[1].addressBalance = new_receiverBalance
def find_input_output_addresses(transaction_data):
# Create vinlist and outputlist
vinlist = []
querylist = []
for vin in transaction_data["vin"]:
vinlist.append([vin["addresses"][0], float(vin["value"])])
totalinputval = float(transaction_data["valueIn"])
# todo Rule 41 - Check if all the addresses in a transaction on the input side are the same
for idx, item in enumerate(vinlist):
if idx == 0:
temp = item[0]
continue
if item[0] != temp:
print(f"System has found more than one address as part of vin. Transaction {transaction_data['txid']} is rejected")
return 0
inputlist = [vinlist[0][0], totalinputval]
inputadd = vinlist[0][0]
# todo Rule 42 - If the number of vout is more than 2, reject the transaction
if len(transaction_data["vout"]) > 2:
print(f"System has found more than 2 addresses as part of vout. Transaction {transaction_data['txid']} is rejected")
return 0
# todo Rule 43 - A transaction accepted by the system has two vouts: 1. The FLO address of the receiver
# 2. The FLO address of the sender as the change address. If a vout address is the change address, then the other
# address is the receiver address
outputlist = []
addresscounter = 0
inputcounter = 0
for obj in transaction_data["vout"]:
if obj["scriptPubKey"]["type"] == "pubkeyhash":
addresscounter = addresscounter + 1
if inputlist[0] == obj["scriptPubKey"]["addresses"][0]:
inputcounter = inputcounter + 1
continue
outputlist.append([obj["scriptPubKey"]["addresses"][0], obj["value"]])
if addresscounter == inputcounter:
outputlist = [inputlist[0]]
elif len(outputlist) != 1:
print(f"Transaction's change is not coming back to the input address. Transaction {transaction_data['txid']} is rejected")
return 0
else:
outputlist = outputlist[0]
return inputlist[0], outputlist[0]
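The vout rules above (Rule 42: reject if more than two outputs; Rule 43: the non-change output is the receiver, and a pure self-transfer resolves to the input address) can be condensed into a small sketch. `pick_receiver` and the `F…` addresses are illustrative, not part of the codebase, and it simplifies the original by counting only pubkeyhash outputs:

```python
def pick_receiver(sender, vouts):
    # collect pubkeyhash output addresses, mirroring the vout loop above
    outs = [v['scriptPubKey']['addresses'][0]
            for v in vouts if v['scriptPubKey']['type'] == 'pubkeyhash']
    if len(outs) > 2:
        return None  # Rule 42: rejected
    others = [a for a in outs if a != sender]
    if not others:
        return sender  # all outputs return to the input address (self-transfer)
    if len(others) != 1:
        return None  # change is not coming back to the input address: rejected
    return others[0]

vouts = [
    {'scriptPubKey': {'type': 'pubkeyhash', 'addresses': ['FsenderExample']}},
    {'scriptPubKey': {'type': 'pubkeyhash', 'addresses': ['FreceiverExample']}},
]
print(pick_receiver('FsenderExample', vouts))  # FreceiverExample
```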
def rollback_database(blockNumber, dbtype, dbname):
if dbtype == 'token':
# Connect to database
db_session = create_database_session_orm('token', {'token_name':dbname}, TokenBase)
while(True):
subqry = db_session.query(func.max(ActiveTable.id))
activeTable_entry = db_session.query(ActiveTable).filter(ActiveTable.id == subqry).first()
if activeTable_entry is None or activeTable_entry.blockNumber <= blockNumber:
break
outputAddress = activeTable_entry.address
transferAmount = activeTable_entry.transferBalance
inputAddress = None
# Find out consumedpid and partially consumed pids
parentid = None
orphaned_parentid = None
consumedpid = None
if activeTable_entry.parentid is not None:
parentid = activeTable_entry.parentid
if activeTable_entry.orphaned_parentid is not None:
orphaned_parentid = activeTable_entry.orphaned_parentid
if activeTable_entry.consumedpid is not None:
consumedpid = literal_eval(activeTable_entry.consumedpid)
# filter out based on consumped pid and partially consumed pids
if parentid is not None:
# find query in activeTable with the parentid
activeTable_pid_entry = db_session.query(ActiveTable).filter(ActiveTable.id == parentid).all()[0]
# calculate the amount taken from parentid
activeTable_pid_entry.transferBalance = activeTable_pid_entry.transferBalance + calc_pid_amount(activeTable_entry.transferBalance, consumedpid)
inputAddress = activeTable_pid_entry.address
if orphaned_parentid is not None:
try:
orphaned_parentid_entry = db_session.query(ConsumedTable).filter(ConsumedTable.id == orphaned_parentid).all()[0]
inputAddress = orphaned_parentid_entry.address
except IndexError:
raise RuntimeError(f"Orphaned parentid {orphaned_parentid} not found in ConsumedTable during rollback")
if consumedpid != {}:
# each key of the pid is totally consumed and with its corresponding value written in the end
# how can we maintain the order of pid consumption? The bigger pid number will be towards the end
# 1. pull the pid number and its details from the consumedpid table
for key in list(consumedpid.keys()):
consumedpid_entry = db_session.query(ConsumedTable).filter(ConsumedTable.id == key).all()[0]
newTransferBalance = consumedpid_entry.transferBalance + consumedpid[key]
db_session.add(ActiveTable(id=consumedpid_entry.id, address=consumedpid_entry.address, parentid=consumedpid_entry.parentid ,consumedpid=consumedpid_entry.consumedpid, transferBalance=newTransferBalance, addressBalance = None, orphaned_parentid=consumedpid_entry.orphaned_parentid ,blockNumber=consumedpid_entry.blockNumber))
inputAddress = consumedpid_entry.address
db_session.delete(consumedpid_entry)
orphaned_parentid_entries = db_session.query(ActiveTable).filter(ActiveTable.orphaned_parentid == key).all()
if len(orphaned_parentid_entries) != 0:
for orphan_entry in orphaned_parentid_entries:
orphan_entry.parentid = orphan_entry.orphaned_parentid
orphan_entry.orphaned_parentid = None
orphaned_parentid_entries = db_session.query(ConsumedTable).filter(ConsumedTable.orphaned_parentid == key).all()
if len(orphaned_parentid_entries) != 0:
for orphan_entry in orphaned_parentid_entries:
orphan_entry.parentid = orphan_entry.orphaned_parentid
orphan_entry.orphaned_parentid = None
# update addressBalance
rollback_address_balance_processing(db_session, inputAddress, outputAddress, transferAmount)
# delete operations
# delete the last row in activeTable and transactionTable
db_session.delete(activeTable_entry)
db_session.query(TransactionHistory).filter(TransactionHistory.blockNumber > blockNumber).delete()
db_session.query(TransferLogs).filter(TransferLogs.blockNumber > blockNumber).delete()
db_session.commit()
elif dbtype == 'smartcontract':
db_session = create_database_session_orm('smart_contract', {'contract_name':f"{dbname['contract_name']}", 'contract_address':f"{dbname['contract_address']}"}, ContractBase)
db_session.query(ContractTransactionHistory).filter(ContractTransactionHistory.blockNumber > blockNumber).delete()
db_session.query(ContractParticipants).filter(ContractParticipants.blockNumber > blockNumber).delete()
db_session.query(ContractDeposits).filter(ContractDeposits.blockNumber > blockNumber).delete()
db_session.query(ConsumedInfo).filter(ConsumedInfo.blockNumber > blockNumber).delete()
db_session.query(ContractWinners).filter(ContractWinners.blockNumber > blockNumber).delete()
db_session.commit()
def delete_database_old(blockNumber, dbname):
db_session = create_database_session_orm('system_dbs', {'db_name':'system'}, SystemBase)
databases_to_delete = db_session.query(DatabaseTypeMapping.db_name, DatabaseTypeMapping.db_type).filter(DatabaseTypeMapping.blockNumber>blockNumber).all()
db_names = [database[0] for database in databases_to_delete]
for database in databases_to_delete:
if database[1] in ['token','infinite-token']:
dirpath = os.path.join(apppath, 'tokens', f"{database[0]}.db")
if os.path.exists(dirpath):
os.remove(dirpath)
elif database[1] in ['smartcontract']:
dirpath = os.path.join(apppath, 'smartContracts', f"{database[0]}.db")
if os.path.exists(dirpath):
os.remove(dirpath)
return db_names
def delete_database(blockNumber, dbname):
db_session = create_database_session_orm('system_dbs', {'db_name':'system'}, SystemBase)
databases_to_delete = db_session.query(DatabaseTypeMapping.db_name, DatabaseTypeMapping.db_type).filter(DatabaseTypeMapping.db_name == dbname).all()
db_names = [database[0] for database in databases_to_delete]
for database in databases_to_delete:
if database[1] in ['token','infinite-token','nft']:
dirpath = os.path.join(apppath, 'tokens', f"{dbname}.db")
if os.path.exists(dirpath):
os.remove(dirpath)
elif database[1] in ['smartcontract']:
dirpath = os.path.join(apppath, 'smartContracts', f"{dbname}.db")
if os.path.exists(dirpath):
os.remove(dirpath)
return db_names
def system_database_deletions(blockNumber):
latestcache_session = create_database_session_orm('system_dbs', {'db_name': 'latestCache'}, LatestCacheBase)
# delete latestBlocks & latestTransactions entry
latestcache_session.query(LatestBlocks).filter(LatestBlocks.blockNumber > blockNumber).delete()
latestcache_session.query(LatestTransactions).filter(LatestTransactions.blockNumber > blockNumber).delete()
# delete activeContracts, contractAddressMapping, DatabaseAddressMapping, rejectedContractTransactionHistory, rejectedTransactionHistory, tokenAddressMapping
systemdb_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
activeContracts_session = systemdb_session.query(ActiveContracts).filter(ActiveContracts.blockNumber > blockNumber).delete()
contractAddressMapping_queries = systemdb_session.query(ContractAddressMapping).filter(ContractAddressMapping.blockNumber > blockNumber).delete()
databaseTypeMapping_queries = systemdb_session.query(DatabaseTypeMapping).filter(DatabaseTypeMapping.blockNumber > blockNumber).delete()
rejectedContractTransactionHistory_queries = systemdb_session.query(RejectedContractTransactionHistory).filter(RejectedContractTransactionHistory.blockNumber > blockNumber).delete()
rejectedTransactionHistory_queries = systemdb_session.query(RejectedTransactionHistory).filter(RejectedTransactionHistory.blockNumber > blockNumber).delete()
tokenAddressMapping_queries = systemdb_session.query(TokenAddressMapping).filter(TokenAddressMapping.blockNumber > blockNumber).delete()
timeAction_queries = systemdb_session.query(TimeActions).filter(TimeActions.blockNumber > blockNumber).delete()
systemdb_session.query(SystemData).filter(SystemData.attribute=='lastblockscanned').update({SystemData.value:str(blockNumber)})
latestcache_session.commit()
systemdb_session.commit()
latestcache_session.close()
systemdb_session.close()
def return_token_contract_set(rollback_block):
latestcache_session = create_database_session_orm('system_dbs', {'db_name': 'latestCache'}, LatestCacheBase)
latestBlocks = latestcache_session.query(LatestBlocks).filter(LatestBlocks.blockNumber > rollback_block).all()
lblocks_dict = {}
blocknumber_list = []
for block in latestBlocks:
block_dict = block.__dict__
lblocks_dict[block_dict['blockNumber']] = {'blockHash':f"{block_dict['blockHash']}", 'jsonData':f"{block_dict['jsonData']}"}
blocknumber_list.insert(0,block_dict['blockNumber'])
tokendb_set = set()
smartcontractdb_set = set()
for blockindex in blocknumber_list:
# Find the all the transactions that happened in this block
try:
block_tx_hashes = json.loads(lblocks_dict[str(blockindex)]['jsonData'])['tx']
except (KeyError, json.JSONDecodeError):
print(f"Block {blockindex} is not found in latestCache. Skipping this block")
continue
for txhash in block_tx_hashes:
# Get the transaction details
transaction = latestcache_session.query(LatestTransactions).filter(LatestTransactions.transactionHash == txhash).first()
transaction_data = json.loads(transaction.jsonData)
inputAddress, outputAddress = find_input_output_addresses(transaction_data)
parsed_flodata = literal_eval(transaction.parsedFloData)
tokenlist, contractlist = getDatabase_from_parsedFloData(parsed_flodata, inputAddress, outputAddress)
for token in tokenlist:
tokendb_set.add(token)
for contract in contractlist:
smartcontractdb_set.add(contract)
return tokendb_set, smartcontractdb_set
def initiate_rollback_process():
'''
tokendb_set, smartcontractdb_set = return_token_contract_set(rollback_block)
'''
# Connect to system.db
systemdb_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
db_names = systemdb_session.query(DatabaseTypeMapping).all()
for db in db_names:
if db.db_type in ['token', 'nft', 'infinite-token']:
if db.blockNumber > rollback_block:
delete_database(rollback_block, f"{db.db_name}")
else:
rollback_database(rollback_block, 'token', f"{db.db_name}")
elif db.db_type in ['smartcontract']:
if db.blockNumber > rollback_block:
delete_database(rollback_block, f"{db.db_name}")
else:
db_split = db.db_name.rsplit('-',1)
db_name = {'contract_name':db_split[0], 'contract_address':db_split[1]}
rollback_database(rollback_block, 'smartcontract', db_name)
'''
for token_db in tokendb_set:
token_session = create_database_session_orm('token', {'token_name': token_db}, TokenBase)
if token_session.query(TransactionHistory.blockNumber).first()[0] > rollback_block:
delete_database(rollback_block, token_db)
token_session.commit()
else:
rollback_database(rollback_block, 'token', token_db)
token_session.close()
for contract_db in smartcontractdb_set:
contract_session = create_database_session_orm('smartcontract', {'db_name': contract_db}, ContractBase)
if contract_session.query(TransactionHistory.blockNumber).first()[0] > rollback_block:
delete_database(rollback_block, contract_db)
contract_session.commit()
else:
rollback_database(rollback_block, 'smartcontract', contract_db)
contract_session.close()
'''
system_database_deletions(rollback_block)
# update lastblockscanned in system_dbs
systemdb_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
lastblockscanned_query = systemdb_session.query(SystemData).filter(SystemData.attribute=='lastblockscanned').first()
lastblockscanned_query.value = str(rollback_block)
systemdb_session.commit()
systemdb_session.close()
def rollback_to_block(block_number):
global rollback_block
rollback_block = block_number
start_rollback_process()
def start_rollback_process():
systemdb_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
lastblockscanned_query = systemdb_session.query(SystemData).filter(SystemData.attribute=='lastblockscanned').first()
if rollback_block > int(lastblockscanned_query.value):
print('Rollback block is greater than the last scanned block.\nExiting...')
sys.exit(0)
else:
initiate_rollback_process()
if __name__ == "__main__":
# Take input from user reg how many blocks to go back in the blockchain
parser = argparse.ArgumentParser(description='Script tracks RMT using FLO data on the FLO blockchain - https://flo.cash')
parser.add_argument('-rb', '--toblocknumber', nargs='?', type=int, help='Roll back the script to the specified block number')
parser.add_argument('-r', '--blockcount', nargs='?', type=int, help='Roll back the script by the specified number of blocks')
args = parser.parse_args()
# Get all the transaction and block details from latestCache reg the transactions in the block
systemdb_session = create_database_session_orm('system_dbs', {'db_name': 'system'}, SystemBase)
lastscannedblock = systemdb_session.query(SystemData.value).filter(SystemData.attribute=='lastblockscanned').first()
systemdb_session.close()
lastscannedblock = int(lastscannedblock.value)
if args.blockcount and args.toblocknumber:
print("You can only specify one of the options -rb or -r")
sys.exit(0)
elif args.blockcount:
rollback_block = lastscannedblock - args.blockcount
elif args.toblocknumber:
rollback_block = args.toblocknumber
else:
print("Please specify the number of blocks to roll back")
sys.exit(0)
start_rollback_process()
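The hand-rolled "only one of -rb or -r" check can also be expressed with argparse's built-in mutual exclusion, which rejects conflicting flags before any database work starts. A minimal standalone sketch (not wired into the script; `lastscannedblock` is a stand-in for the value read from system.db):

```python
import argparse

# Sketch: let argparse enforce that -rb and -r are mutually exclusive
# and that at least one of them is given, instead of checking by hand.
parser = argparse.ArgumentParser(description='Rollback options sketch')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument('-rb', '--toblocknumber', type=int,
                   help='Roll back to the specified block number')
group.add_argument('-r', '--blockcount', type=int,
                   help='Roll back by the specified number of blocks')

# Parse a sample command line; in the real script this would be parse_args().
args = parser.parse_args(['-r', '100'])
lastscannedblock = 5000  # stand-in for the value read from system.db
rollback_block = (args.toblocknumber if args.toblocknumber is not None
                  else lastscannedblock - args.blockcount)
```

With this arrangement, passing both flags (or neither) makes argparse print a usage error and exit on its own.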

src/flags.py (new file)
@@ -0,0 +1,52 @@
FLAGS = {}
FLAGS["is_running"] = None
FLAGS["is_backend_active"] = None
FLAGS["is_backend_syncing"] = None
FLAGS["is_backend_ready"] = None
FLAGS["is_api_server_active"] = None
def is_running():
return bool(FLAGS["is_running"])
def set_run_start():
FLAGS["is_running"] = True
def set_run_stop():
FLAGS["is_running"] = False
def set_backend_start():
FLAGS["is_backend_active"] = True
def set_backend_stop():
FLAGS["is_backend_active"] = False
def is_backend_active():
return bool(FLAGS["is_backend_active"])
def set_backend_sync_start():
FLAGS["is_backend_syncing"] = True
def set_backend_sync_stop():
FLAGS["is_backend_syncing"] = False
def is_backend_syncing():
return bool(FLAGS["is_backend_syncing"])
def set_backend_ready():
FLAGS["is_backend_ready"] = True
def set_backend_not_ready():
FLAGS["is_backend_ready"] = False
def is_backend_ready():
return bool(FLAGS["is_backend_ready"])
def set_api_start():
FLAGS["is_api_server_active"] = True
def set_api_stop():
FLAGS["is_api_server_active"] = False
def is_api_active():
return bool(FLAGS["is_api_server_active"])
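The commit history notes that API v2 responses carry status codes that depend on backend state: 200/206 when data is found, 404/503 when it is not, with the 206/503 variants used while the backend is still syncing. As a minimal, self-contained sketch of how these flags might feed that decision — `choose_status` is a hypothetical helper, not part of this diff, and `FLAGS` is re-declared locally so the snippet runs standalone:

```python
# Hypothetical helper showing how the sync flag could select an API v2
# status code per the commit message: 200/206 for found data, 404/503
# for missing data, depending on whether the backend is still syncing.
FLAGS = {"is_backend_syncing": False}

def is_backend_syncing():
    return bool(FLAGS["is_backend_syncing"])

def choose_status(data_found):
    if data_found:
        return 206 if is_backend_syncing() else 200
    return 503 if is_backend_syncing() else 404
```

The API layer would then attach the warning message described in the commit whenever the chosen code is 206 or 503.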

tests/test_parsing.py (new file)
@@ -0,0 +1,219 @@
import unittest
import sys
sys.path.append("..")
import parsing
class TestParsing(unittest.TestCase):
blockinfo_stub = {'time': 25634}
def test_token_creation(self):
text = 'create 100 rmt#'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'mainnet')
expected_result = {
'type': 'tokenIncorporation',
'flodata': 'create 100 rmt#',
'tokenIdentification': 'rmt',
'tokenAmount': 100.0,
'stateF': False
}
self.assertEqual(result, expected_result)
def test_token_transfer(self):
text = 'transfer 10.340 rmt#'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'mainnet')
expected_result = {
'type': 'transfer',
'transferType': 'token',
'flodata': 'transfer 10.340 rmt#',
'tokenIdentification': 'rmt',
'tokenAmount': 10.34,
'stateF': False
}
self.assertEqual(result, expected_result)
def test_nft_creation(self):
pass
def test_nft_transfer(self):
pass
def test_infinite_token_incorporation(self):
text = 'create usd# as infinite-token'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'mainnet')
expected_result = {
'type': 'infiniteTokenIncorporation',
'flodata': 'create usd# as infinite-token',
'tokenIdentification': 'usd',
'stateF': False
}
self.assertEqual(result, expected_result)
text = 'create usd# as infinite-token send'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'mainnet')
expected_result = {'type': 'noise'}
self.assertEqual(result, expected_result)
def test_infinite_token_transfer(self):
pass
def test_onetimeevent_timetrigger_creation(self):
# contractamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) contractAmount=0.1 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {
'type': 'smartContractIncorporation',
'contractType': 'one-time-event',
'subtype': 'time-trigger',
'tokenIdentification': 'bioscope',
'contractName': 'all-crowd-fund-1',
'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz',
'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) contractAmount=0.1 end-contract-conditions',
'contractConditions': {
'contractAmount': '0.1',
'payeeAddress': {
'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0
},
'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530',
'unix_expiryTime': 1668387900.0
}
}
self.assertEqual(result, expected_result)
# minimumsubscriptionamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype':'time-trigger','tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 end-contract-conditions', 'contractConditions': {'minimumsubscriptionamount': '1.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# maximumsubscriptionamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) maximumsubscriptionamount=10 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger','tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) maximumsubscriptionamount=10 end-contract-conditions', 'contractConditions': {'maximumsubscriptionamount': '10.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# minimumsubscriptionamount | contractamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1.600 (4) contractAmount=0.1 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1.600 (4) contractAmount=0.1 end-contract-conditions', 'contractConditions': {'contractAmount': '0.1', 'minimumsubscriptionamount': '1.6', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# maximumsubscriptionamount | contractamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) maximumsubscriptionamount=10 (4) contractAmount=0.1 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) maximumsubscriptionamount=10 (4) contractAmount=0.1 end-contract-conditions', 'contractConditions': {'contractAmount': '0.1', 'maximumsubscriptionamount': '10.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# minimumsubscriptionamount | maximumsubscriptionamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype':'time-trigger','tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 end-contract-conditions', 'contractConditions': {'minimumsubscriptionamount': '1.0', 'maximumsubscriptionamount': '10.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# minimumsubscriptionamount | maximumsubscriptionamount | contractamount
text = '''Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions:(1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 (5) contractAmount=0.1 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 (5) contractAmount=0.1 end-contract-conditions', 'contractConditions': {'contractAmount': '0.1', 'minimumsubscriptionamount': '1.0', 'maximumsubscriptionamount': '10.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
# With single payeeAddress with : format
text = "Create a smart contract of the name album-fund@ of the type one-time-event* using asset bioscope# at the FLO address ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt$ with contract-conditions: (1) expiryTime= Thu May 04 2023 18:57:00 GMT+0530 (India Standard Time) (2) payeeAddress= objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc:100 end-contract-conditions"
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'album-fund', 'contractAddress': 'ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt', 'flodata': 'Create a smart contract of the name album-fund@ of the type one-time-event* using asset bioscope# at the FLO address ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt$ with contract-conditions: (1) expiryTime= Thu May 04 2023 18:57:00 GMT+0530 (India Standard Time) (2) payeeAddress= objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc:100 end-contract-conditions', 'contractConditions': {'payeeAddress': {'objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc': 100.0}, 'expiryTime': 'thu may 04 2023 18:57:00 gmt+0530 (india standard time)', 'unix_expiryTime': 1683246420.0}}
self.assertEqual(result, expected_result)
# With single payeeAddress with normal format
text = "Create a smart contract of the name album-fund@ of the type one-time-event* using asset bioscope# at the FLO address ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt$ with contract-conditions: (1) expiryTime= Thu May 04 2023 18:57:00 GMT+0530 (India Standard Time) (2) payeeAddress= objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc end-contract-conditions"
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'album-fund', 'contractAddress': 'ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt', 'flodata': 'Create a smart contract of the name album-fund@ of the type one-time-event* using asset bioscope# at the FLO address ocsiFSsjek3UXKdHpBWF79qrGN6qbpxeMt$ with contract-conditions: (1) expiryTime= Thu May 04 2023 18:57:00 GMT+0530 (India Standard Time) (2) payeeAddress= objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc end-contract-conditions', 'contractConditions': {'payeeAddress': {'objfBRUX5zn4W56aHhRn4DgH6xqeRWk6Xc': 100}, 'expiryTime': 'thu may 04 2023 18:57:00 gmt+0530 (india standard time)', 'unix_expiryTime': 1683246420.0}}
self.assertEqual(result, expected_result)
# With multiple payeeAddress with : format
text = "Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 (5) contractAmount=0.1 end-contract-conditions"
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'smartContractIncorporation', 'contractType': 'one-time-event', 'subtype': 'time-trigger', 'tokenIdentification': 'bioscope', 'contractName': 'all-crowd-fund-1', 'contractAddress': 'oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz', 'flodata': 'Create a smart contract of the name all-crowd-fund-1@ of the type one-time-event* using asset bioscope# at the FLO address oQkpZCBcAWc945viKqFmJVbVG4aKY4V3Gz$ with contract-conditions: (1) expiryTime= Sun Nov 13 2022 19:35:00 GMT+0530 (2) payeeAddress=oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7:10:oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij:20:oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5:30:oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ:40 (3) minimumsubscriptionamount=1 (4) maximumsubscriptionamount=10 (5) contractAmount=0.1 end-contract-conditions', 'contractConditions': {'contractAmount': '0.1', 'minimumsubscriptionamount': '1.0', 'maximumsubscriptionamount': '10.0', 'payeeAddress': {'oQotdnMBAP1wZ6Kiofx54S2jNjKGiFLYD7': 10.0, 'oMunmikKvxsMSTYzShm2X5tGrYDt9EYPij': 20.0, 'oRpvvGEVKwWiMnzZ528fPhiA2cZA3HgXY5': 30.0, 'oWpVCjPDGzaiVfEFHs6QVM56V1uY1HyCJJ': 40.0}, 'expiryTime': 'sun nov 13 2022 19:35:00 gmt+0530', 'unix_expiryTime': 1668387900.0}}
self.assertEqual(result, expected_result)
def test_onetimeevent_timetrigger_participation(self):
text = '''send 2.2 bioscope# to all-crowd-fund@'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'transfer', 'transferType': 'smartContract', 'flodata': 'send 2.2 bioscope# to all-crowd-fund@', 'tokenIdentification': 'bioscope', 'tokenAmount': 2.2, 'contractName': 'all-crowd-fund'}
self.assertEqual(result, expected_result)
text = 'transfer 6.20000 bioscope# to all-crowd-fund-7@'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'transfer', 'transferType': 'smartContract', 'flodata': 'transfer 6.20000 bioscope# to all-crowd-fund-7@', 'tokenIdentification': 'bioscope', 'tokenAmount': 6.2, 'contractName': 'all-crowd-fund-7'}
self.assertEqual(result, expected_result)
text = 'transfer 6.20000 bioscope# to all-crowd-fund-7@ 24'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'noise'}
self.assertEqual(result, expected_result)
text = 'transfer 6.20000 bioscope# to all-crowd-fund-7@ 24 '
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'noise'}
self.assertEqual(result, expected_result)
text = '6.20.000 transfer bioscope# to all-crowd-fund-7@ 24'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'transfer', 'transferType': 'smartContract', 'flodata': '6.20.000 transfer bioscope# to all-crowd-fund-7@ 24', 'tokenIdentification': 'bioscope', 'tokenAmount': 24.0, 'contractName': 'all-crowd-fund-7'}
self.assertEqual(result, expected_result)
def test_onetimeevent_externaltrigger_creation(self):
# contractamount
text = '''Create a smart contract of the name twitter-survive@ of the type one-time-event* using asset bioscope# at the FLO address oVbebBNuERWbouDg65zLfdataWEMTnsL8r$ with contract-conditions:(1) expiryTime= Sun Nov 15 2022 14:55:00 GMT+0530 (2) userchoices= survives | dies (3) contractAmount=0.02 end-contract-conditions'''
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {
'type': 'smartContractIncorporation',
'contractType': 'one-time-event',
'subtype': 'external-trigger',
'tokenIdentification': 'bioscope',
'contractName': 'twitter-survive',
'contractAddress': 'oVbebBNuERWbouDg65zLfdataWEMTnsL8r',
'flodata': 'Create a smart contract of the name twitter-survive@ of the type one-time-event* using asset bioscope# at the FLO address oVbebBNuERWbouDg65zLfdataWEMTnsL8r$ with contract-conditions: (1) expiryTime= Sun Nov 15 2022 14:55:00 GMT+0530 (2) userchoices= survives | dies (3) contractAmount=0.02 end-contract-conditions',
'contractConditions': {
'contractAmount': '0.02',
'userchoices': "{0: 'survives', 1: 'dies'}",
'expiryTime': 'sun nov 15 2022 14:55:00 gmt+0530',
'unix_expiryTime': 1668543900.0
}
}
self.assertEqual(result, expected_result)
def test_tokenswap_deposits(self):
text = 'Deposit 1 bioscope# to swap-rupee-bioscope-1@ its FLO address being oTzrcpLPRXsejSdYQ3XN6V4besrAPuJQrk$ with deposit-conditions: (1) expiryTime= Thu Apr 13 2023 21:45:00 GMT+0530'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {
'type': 'smartContractDeposit',
'tokenIdentification': 'bioscope',
'depositAmount': 1.0,
'contractName': 'swap-rupee-bioscope-1',
'flodata': 'Deposit 1 bioscope# to swap-rupee-bioscope-1@ its FLO address being oTzrcpLPRXsejSdYQ3XN6V4besrAPuJQrk$ with deposit-conditions: (1) expiryTime= Thu Apr 13 2023 21:45:00 GMT+0530',
'depositConditions': {
'expiryTime': 'thu apr 13 2023 21:45:00 gmt+0530'
},
'stateF': False}
self.assertEqual(result, expected_result)
def test_contract_trigger(self):
text = 'contract@ triggerCondition:"twitter-survives"'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {
'type': 'smartContractPays',
'contractName': 'contract',
'triggerCondition': 'twitter-survives',
'stateF': False}
self.assertEqual(result, expected_result)
def test_deposit_invalid(self):
text = 'Deposit 1 bioscope# to swap-rupee-bioscope-1@ its FLO address being oTzrcpLPRXsejSdYQ3XN6V4besrAPuJQrk$ with deposit-conditions: (1) expiryTime= Tue, 25 Apr 2023 13:40:00 GMT'
result = parsing.parse_flodata(text, TestParsing.blockinfo_stub, 'testnet')
expected_result = {'type': 'noise'}
self.assertEqual(result, expected_result)
if __name__ == '__main__':
unittest.main()

File diff suppressed because it is too large.