Merge pull request #1 from zone117x/master

update from original
This commit is contained in:
Alejandro Reyero 2014-05-30 23:57:57 +02:00
commit 6acd346ed8
115 changed files with 6659 additions and 1456 deletions

4
.gitignore vendored

@ -1,3 +1,5 @@
node_modules/
.idea/
config.json
config.json
pool_configs/*.json
!pool_configs/litecoin_example.json

456
README.md

@ -2,10 +2,38 @@
#### Node Open Mining Portal
This portal is an extremely efficient, highly scalable, all-in-one, easy to setup cryptocurrency mining pool written
entirely in Node.js. It contains a stratum poolserver, reward/payment/share processor, and a (*not yet completed*)
front-end website.
entirely in Node.js. It contains a stratum poolserver, reward/payment/share processor, and a (*not yet completed*)
responsive, user-friendly front-end website featuring mining instructions, in-depth live statistics, and an admin center.
#### Features
#### Production Usage Notice
This is beta software. Any of the following can change and break an existing NOMP setup: the functionality of any feature, the structure of configuration files, and the structure of Redis data. If you use this software in production then *DO NOT* pull new code straight into production usage, because it can and often will break your setup and require you to tweak things like config files or Redis data.
#### Table of Contents
* [Features](#features)
* [Attack Mitigation](#attack-mitigation)
* [Security](#security)
* [Planned Features](#planned-features)
* [Community Support](#community--support)
* [Usage](#usage)
* [Requirements](#requirements)
* [Setting Up Coin Daemon](#0-setting-up-coin-daemon)
* [Downloading & Installing](#1-downloading--installing)
* [Configuration](#2-configuration)
* [Portal Config](#portal-config)
* [Coin Config](#coin-config)
* [Pool Config](#pool-config)
* [Setting Up Blocknotify](#optional-recommended-setting-up-blocknotify)
* [Starting the Portal](#3-start-the-portal)
* [Upgrading NOMP](#upgrading-nomp)
* [Donations](#donations)
* [Credits](#credits)
* [License](#license)
### Features
* For the pool server it uses the highly efficient [node-stratum-pool](//github.com/zone117x/node-stratum-pool) module which
supports vardiff, POW & POS, transaction messages, anti-DDoS, IP banning, and [several hashing algorithms](//github.com/zone117x/node-stratum-pool#hashing-algorithms-supported).
@ -13,6 +41,7 @@ supports vardiff, POW & POS, transaction messages, anti-DDoS, IP banning, [sever
* The portal has an [MPOS](//github.com/MPOS/php-mpos) compatibility mode so that it can
function as a drop-in-replacement for [python-stratum-mining](//github.com/Crypto-Expert/stratum-mining). This
mode can be enabled in the configuration and will insert shares into a MySQL database in the format which MPOS expects.
For a direct tutorial see the wiki page [Setting up NOMP for MPOS usage](//github.com/zone117x/node-open-mining-portal/wiki/Setting-up-NOMP-for-MPOS-usage).
* Multi-pool ability - this software was built from the ground up to run with multiple coins simultaneously (which can
have different properties and hashing algorithms). It can be used to create a pool for a single coin or for multiple
@ -20,17 +49,15 @@ coins at once. The pools use clustering to load balance across multiple CPU core
* For reward/payment processing, shares are inserted into Redis (a fast NoSQL key/value store). The PROP (proportional)
reward system is used with [Redis Transactions](http://redis.io/topics/transactions) for secure and super speedy payouts.
Each and every share will be rewarded - even for rounds resulting in orphaned blocks.
There is zero risk to the pool operator. Shares from rounds resulting in orphaned blocks will be merged into the
current round so that each and every share is rewarded.
* This portal does not have user accounts/logins/registrations. Instead, miners simply use their coin address for stratum
authentication. A minimalistic HTML5 front-end connects to the portal's statistics API to display stats from each
pool such as connected miners, network/pool difficulty/hash rate, etc.
* Automated switching of connected miners to different pools/coins is also easily done due to the multi-pool architecture
of this software. To use this feature the switching must be controlled by your own script, such as one that calculates
coin profitability via an API such as CoinChoose.com or CoinWarz.com (or calculated locally using daemon-reported network
difficulties and exchange APIs). NOMP's regular payment processing and miner authentication, which use a coin address as the stratum
username, will obviously not work with this coin switching feature - so you must control those with your own script as well.
* Coin-switching ports using coin network and crypto-exchange APIs to detect profitability. Miners connect to these ports
with their public key, which NOMP uses to derive an address for any coin that needs to be paid out.
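The proportional reward logic described in the features above (every share rewarded, with shares from orphaned rounds merged into the current round) can be sketched as follows. This is an illustrative model only; NOMP's actual processor works against Redis, and all function and field names here are invented:

```javascript
// Shares from a round that ended in an orphaned block are not discarded:
// they are merged into the current round so they still earn a reward.
function mergeOrphanedShares(currentRound, orphanedRound) {
  for (const [worker, shares] of Object.entries(orphanedRound)) {
    currentRound[worker] = (currentRound[worker] || 0) + shares;
  }
  return currentRound;
}

// PROP payout: each worker earns blockReward * (their shares / total shares).
function propPayouts(roundShares, blockReward) {
  const total = Object.values(roundShares).reduce((a, b) => a + b, 0);
  const payouts = {};
  for (const [worker, shares] of Object.entries(roundShares)) {
    payouts[worker] = blockReward * (shares / total);
  }
  return payouts;
}

// Bob's 20 shares from an orphaned round still count toward the next block.
const round = mergeOrphanedShares({ alice: 60, bob: 20 }, { bob: 20 });
console.log(propPayouts(round, 50)); // alice: 30, bob: 20
```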
#### Attack Mitigation
@ -64,7 +91,7 @@ allow your own pool to connect upstream to a larger pool server. It will request
redistribute the work to our own connected miners.
#### Community / Support
### Community / Support
IRC
* Support / general discussion join #nomp: https://webchat.freenode.net/?channels=#nomp
* Development discussion join #nomp-dev: https://webchat.freenode.net/?channels=#nomp-dev
@ -80,9 +107,19 @@ If your pool uses NOMP let us know and we will list your website here.
* http://chunkypools.com
* http://clevermining.com
* http://rapidhash.net
* http://suchpool.pw
* http://hashfaster.com
* http://miningpoolhub.com
* http://teamdoge.com
* http://miningwith.us
* http://kryptochaos.com
* http://uberpools.org
* http://onebtcplace.com
* http://minr.es
* http://mining.theminingpools.com
* http://www.omargpools.ca/pools.html
* http://pool.trademybit.com/
* http://fixminer.com
Usage
=====
@ -93,6 +130,14 @@ Usage
* [Node.js](http://nodejs.org/) v0.10+ ([follow these installation instructions](https://github.com/joyent/node/wiki/Installing-Node.js-via-package-manager))
* [Redis](http://redis.io/) key-value store v2.6+ ([follow these instructions](http://redis.io/topics/quickstart))
##### Seriously
Those are legitimate requirements. If you use old versions of Node.js or Redis that may come with your system package manager, you will have problems. Follow the linked instructions to get the latest stable versions.
[**Redis security warning**](http://redis.io/topics/security): be sure to firewall access to Redis - an easy way is to
include `bind 127.0.0.1` in your `redis.conf` file. It's also a good idea to learn about and understand the software that
you are using - a good place to start with Redis is [data persistence](http://redis.io/topics/persistence).
#### 0) Setting up coin daemon
Follow the build/install instructions for your coin daemon. Your coin.conf file should end up looking something like this:
@ -130,10 +175,19 @@ Inside the `config_example.json` file, ensure the default configuration will wor
Explanation for each field:
````javascript
{
/* Specifies the level of log output verbosity. Anything more severy than the level specified
/* Specifies the level of log output verbosity. Anything more severe than the level specified
will also be logged. */
"logLevel": "debug", //or "warning", "error"
/* By default NOMP logs to console and gives pretty colors. If you direct that output to a
log file then disable this feature to avoid nasty characters in your log file. */
"logColors": true,
/* The NOMP CLI (command-line interface) will listen for commands on this port. For example,
blocknotify messages are sent to NOMP through this. */
"cliPort": 17117,
/* By default 'forks' is set to "auto" which will spawn one process/fork/worker for each CPU
core in your system. Each of these workers will run a separate instance of your pool(s),
and the kernel will load balance miners using these forks. Optionally, the 'forks' field
@ -142,10 +196,54 @@ Explanation for each field:
"enabled": true,
"forks": "auto"
},
/* Pool config file will inherit these default values if they are not set. */
"defaultPoolConfigs": {
/* Poll RPC daemons for new blocks every this many milliseconds. */
"blockRefreshInterval": 1000,
/* If no new blocks are available for this many seconds update and rebroadcast job. */
"jobRebroadcastTimeout": 55,
/* Disconnect workers that haven't submitted shares for this many seconds. */
"connectionTimeout": 600,
/* (For MPOS mode) Store the block hashes for shares that aren't block candidates. */
"emitInvalidBlockHashes": false,
/* This option will only authenticate miners using an address or mining key. */
"validateWorkerUsername": true,
/* Enable for client IP addresses to be detected when using a load balancer with TCP
proxy protocol enabled, such as HAProxy with 'send-proxy' param:
http://haproxy.1wt.eu/download/1.5/doc/configuration.txt */
"tcpProxyProtocol": false,
/* If under low-diff share attack we can ban their IP to reduce system/network load. If
running behind HAProxy be sure to enable 'tcpProxyProtocol', otherwise you'll end up
banning your own IP address (and therefore all workers). */
"banning": {
"enabled": true,
"time": 600, //How many seconds to ban worker for
"invalidPercent": 50, //What percent of invalid shares triggers ban
"checkThreshold": 500, //Perform check when this many shares have been submitted
"purgeInterval": 300 //Every this many seconds clear out the list of old bans
},
/* Used for storing share and block submission data and payment processing. */
"redis": {
"host": "127.0.0.1",
"port": 6379
}
},
/* This is the front-end. It's not finished. When it is finished, this comment will say so. */
"website": {
"enabled": true,
/* If you are using a reverse-proxy like nginx to display the website then set this to
127.0.0.1 to not expose the port. */
"host": "0.0.0.0",
"port": 80,
/* Used for displaying stratum connection data on the Getting Started page. */
"stratumHost": "cryppit.com",
@ -156,12 +254,7 @@ Explanation for each field:
/* How many seconds to hold onto historical stats. Currently set to 12 hours. */
"historicalRetention": 43200,
/* How many seconds worth of shares should be gathered to generate hashrate. */
"hashrateWindow": 300,
/* Redis instance of where to store historical stats. */
"redis": {
"host": "localhost",
"port": 6379
}
"hashrateWindow": 300
},
/* Not done yet. */
"adminCenter": {
@ -170,65 +263,74 @@ Explanation for each field:
}
},
/* With this enabled, the master process listens on the configured port for messages from the
'scripts/blockNotify.js' script which your coin daemons can be configured to run when a
new block is available. When a blocknotify message is received, the master process uses
IPC (inter-process communication) to notify each thread about the message. Each thread
then sends the message to the appropriate coin pool. See "Setting up blocknotify" below to
set up your daemon to use this feature. */
"blockNotifyListener": {
"enabled": true,
"port": 8117,
"password": "test"
},
/* With this enabled, the master process will listen on the configured port for messages from
the 'scripts/coinSwitch.js' script which will trigger your proxy pools to switch to the
specified coin (non-case-sensitive). This setting is used in conjunction with the proxy
feature below. */
"coinSwitchListener": {
"enabled": false,
"port": 8118,
"password": "test"
/* Redis instance where global portal data such as historical stats, proxy states,
etc. is stored. */
"redis": {
"host": "127.0.0.1",
"port": 6379
},
/* In a proxy configuration, you can setup ports that accept miners for work based on a
specific algorithm instead of a specific coin. Miners that connect to these ports are
/* With this switching configuration, you can setup ports that accept miners for work based on
a specific algorithm instead of a specific coin. Miners that connect to these ports are
automatically switched to a coin determined by the server. The default coin is the first
configured pool for each algorithm and coin switching can be triggered using the
coinSwitch.js script in the scripts folder.
cli.js script in the scripts folder.
Please note miner address authentication must be disabled when using NOMP in a proxy
configuration and that payout processing is left up to the server administrator. */
"proxy": {
"sha256": {
Miners connecting to these switching ports must use their public key in the format of
RIPEMD160(SHA256(public-key)). An address for each type of coin is derived from the miner's
public key, and payments are sent to that address. */
"switching": {
"switch1": {
"enabled": false,
"port": "3333",
"diff": 10,
"varDiff": {
"minDiff": 16, //Minimum difficulty
"maxDiff": 512, //Network difficulty will be used if it is lower than this
"targetTime": 15, //Try to get 1 share per this many seconds
"retargetTime": 90, //Check to see if we should retarget every this many seconds
"variancePercent": 30 //Allow time to vary this % from target without retargeting
"algorithm": "sha256",
"ports": {
"3333": {
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
}
}
},
"scrypt": {
"switch2": {
"enabled": false,
"port": "4444",
"diff": 10,
"varDiff": {
"minDiff": 16, //Minimum difficulty
"maxDiff": 512, //Network difficulty will be used if it is lower than this
"targetTime": 15, //Try to get 1 share per this many seconds
"retargetTime": 90, //Check to see if we should retarget every this many seconds
"variancePercent": 30 //Allow time to vary this % from target without retargeting
"algorithm": "scrypt",
"ports": {
"4444": {
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
}
}
},
"scrypt-n": {
"switch3": {
"enabled": false,
"port": "5555"
"algorithm": "x11",
"ports": {
"5555": {
"diff": 0.001
}
}
}
},
"profitSwitch": {
"enabled": false,
"updateInterval": 600,
"depth": 0.90,
"usePoloniex": true,
"useCryptsy": true,
"useMintpal": true
}
}
````
@ -241,12 +343,22 @@ Here is an example of the required fields:
{
"name": "Litecoin",
"symbol": "ltc",
"algorithm": "scrypt", //or "sha256", "scrypt-jane", "scrypt-n", "quark", "x11"
"txMessages": false, //or true (not required, defaults to false)
"algorithm": "scrypt",
/* Magic value only required for setting up p2p block notifications. It is found in the daemon
source code as the pchMessageStart variable.
For example, litecoin mainnet magic: http://git.io/Bi8YFw
And for litecoin testnet magic: http://git.io/NXBYJA */
"peerMagic": "fbc0b6db", //optional
"peerMagicTestnet": "fcc1b7dc" //optional
//"txMessages": false, //optional - defaults to false
//"mposDiffMultiplier": 256, //optional - only for x11 coins in mpos mode
}
````
For additional documentation how to configure coins *(especially important for scrypt-n and scrypt-jane coins)*
For additional documentation on how to configure coins and their different algorithms
see [these instructions](//github.com/zone117x/node-stratum-pool#module-usage).
@ -263,119 +375,41 @@ Description of options:
"address": "mi4iBXbBsydtcc5yFmsff2zCFVX4XG7qJc", //Address to where block rewards are given
"blockRefreshInterval": 1000, //How often to poll RPC daemons for new blocks, in milliseconds
/* Block rewards go to the configured pool wallet address to later be paid out to miners,
except for a percentage that can go to, for example, pool operator(s) as pool fees or
to a donations address. Addresses or hashed public keys can be used. Here is an example
of rewards going to the main pool op, a pool co-owner, and a NOMP donation. */
"rewardRecipients": {
"n37vuNFkXfk15uFnGoVyHZ6PYQxppD3QqK": 1.5, //1.5% goes to pool op
"mirj3LtZxbSTharhtXvotqtJXUY7ki5qfx": 0.5, //0.5% goes to a pool co-owner
/* How many milliseconds should have passed before new block transactions will trigger a new
job broadcast. */
"txRefreshInterval": 20000,
/* Some miner software is bugged and will consider the pool offline if it doesn't receive
anything for around a minute, so every time we broadcast jobs, set a timeout to rebroadcast
in this many seconds unless we find a new job. Set to zero or remove to disable this. */
"jobRebroadcastTimeout": 55,
//instanceId: 37, //Recommend not using this because a crypto-random one will be generated
/* Some attackers will create thousands of workers that use up all available socket connections,
usually the workers are zombies and don't submit shares after connecting. This feature
detects those and disconnects them. */
"connectionTimeout": 600, //Remove workers that haven't been in contact for this many seconds
/* Sometimes you want the block hashes even for shares that aren't block candidates. */
"emitInvalidBlockHashes": false,
/* We use proper maximum algorithm difficulties found in the coin daemon source code. Most
miners/pools that deal with scrypt use a guesstimated one that is about 5.86% off from the
actual one. So here we can set a tolerable threshold for when a share is slightly too low
due to mining apps using incorrect max diffs and this pool using correct max diffs. */
"shareVariancePercent": 10,
/* This determines what to do with submitted shares (and stratum worker authentication).
You have two options:
1) Enable internal and disable mpos = this portal handles all share payments.
2) Enable mpos and disable internal = shares will be inserted into MySQL database
for MPOS to process. */
"shareProcessing": {
"internal": {
"enabled": true,
/* When workers connect, to receive payments, their address must be used as the worker
name. If this option is true, on worker authentication, their address will be
verified via a validateaddress API call to the daemon. Miners with invalid addresses
will be rejected. */
"validateWorkerAddress": true,
/* Every this many seconds, get submitted blocks from Redis, use daemon RPC to check
their confirmation status, and if confirmed, get the shares from Redis that contributed
to the block and send out payments. */
"paymentInterval": 30,
/* Minimum number of coins that a miner must earn before payment is sent. Typically,
a higher minimum means fewer transaction fees (you profit more) but miners see
payments less frequently (which they dislike). The opposite holds for a lower minimum. */
"minimumPayment": 0.001,
/* Minimum number of coins to keep in the pool wallet. It is recommended to deposit at
least this many coins into the pool wallet when first starting the pool. */
"minimumReserve": 10,
/* (2% default) What percent fee your pool takes from the block reward. */
"feePercent": 0.02,
/* Name of the daemon account to use when moving coin profit within daemon wallet. */
"feeCollectAccount": "feesCollected",
/* Your address that receives pool revenue from fees. */
"feeReceiveAddress": "LZz44iyF4zLCXJTU8RxztyyJZBntdS6fvv",
/* How many coins from fee revenue must accumulate on top of the
minimum reserve amount in order to trigger withdrawal to fee address. The higher
this threshold, the less of your profit goes to transactions fees. */
"feeWithdrawalThreshold": 5,
/* This daemon is used to send out payments. It MUST be for the daemon that owns the
configured 'address' that receives the block rewards, otherwise the daemon will not
be able to confirm blocks or send out payments. */
"daemon": {
"host": "localhost",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
},
/* Redis database used for storing share and block submission data. */
"redis": {
"host": "localhost",
"port": 6379
}
},
/* Enable mpos and shares will be inserted into the share table in a MySQL database. You may
also want to use the "emitInvalidBlockHashes" option below if you require it. */
"mpos": {
"enabled": false,
"host": "localhost", //MySQL db host
"port": 3306, //MySQL db port
"user": "me", //MySQL db user
"password": "mypass", //MySQL db password
"database": "ltc", //MySQL db database name
/* For when miners authenticate: set to "password" for both worker name and password to
be checked in the database, set to "worker" for only the worker name to be checked, or
don't use this option (set to "none") for no auth checks. */
"stratumAuth": "password"
}
/* 0.1% donation to NOMP. This pubkey can accept any type of coin, please leave this in
your config to help support NOMP development. */
"22851477d63a085dbc2398c8430af1c09e7343f6": 0.1
},
/* If a worker submits a high percentage of invalid shares we can temporarily ban them
to reduce system/network load. Also useful for fighting flooding attacks. */
"banning": {
"paymentProcessing": {
"enabled": true,
"time": 600, //How many seconds to ban worker for
"invalidPercent": 50, //What percent of invalid shares triggers ban
"checkThreshold": 500, //Check invalid percent when this many shares have been submitted
"purgeInterval": 300 //Every this many seconds clear out the list of old bans
/* Every this many seconds, get submitted blocks from Redis, use daemon RPC to check
their confirmation status, and if confirmed, get the shares from Redis that contributed
to the block and send out payments. */
"paymentInterval": 30,
/* Minimum number of coins that a miner must earn before payment is sent. Typically,
a higher minimum means fewer transaction fees (you profit more) but miners see
payments less frequently (which they dislike). The opposite holds for a lower minimum. */
"minimumPayment": 0.01,
/* This daemon is used to send out payments. It MUST be for the daemon that owns the
configured 'address' that receives the block rewards, otherwise the daemon will not
be able to confirm blocks or send out payments. */
"daemon": {
"host": "127.0.0.1",
"port": 19332,
"user": "testuser",
"password": "testpass"
}
},
/* Each pool can have as many ports for your miners to connect to as you wish. Each port can
@ -400,41 +434,52 @@ Description of options:
}
},
/* For redundancy, recommended to have at least two daemon instances running in case one
drops out-of-sync or offline. */
/* More than one daemon instance can be set up in case one drops out-of-sync or dies. */
"daemons": [
{ //Main daemon instance
"host": "localhost",
"host": "127.0.0.1",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
},
{ //Backup daemon instance
"host": "localhost",
"port": 19344,
"user": "litecoinrpc",
"password": "testnet"
"user": "testuser",
"password": "testpass"
}
],
/* This allows the pool to connect to the daemon as a node peer to recieve block updates.
/* This allows the pool to connect to the daemon as a node peer to receive block updates.
It may be the most efficient way to get block updates (faster than polling, less
intensive than blocknotify script). However its still under development (not yet working). */
intensive than blocknotify script). It requires the additional field "peerMagic" in
the coin config. */
"p2p": {
"enabled": false,
"host": "localhost",
/* Host for daemon */
"host": "127.0.0.1",
/* Port configured for daemon (this is the actual peer port not RPC port) */
"port": 19333,
/* Magic value is different for main/testnet and for each coin. It is found in the daemon
source code as the pchMessageStart variable.
For example, litecoin mainnet magic: http://git.io/Bi8YFw
And for litecoin testnet magic: http://git.io/NXBYJA
*/
"magic": "fcc1b7dc",
/* If your coin daemon is new enough (i.e. not a shitcoin) then it will support a p2p
feature that prevents the daemon from spamming our peer node with unnecessary
transaction data. Assume it's supported, but if you have problems try disabling it. */
"disableTransactions": true
},
/* Enable this mode and shares will be inserted into a MySQL database. You may also want
to use the "emitInvalidBlockHashes" option if you require it. The config options
"redis" and "paymentProcessing" will be ignored/unused if this is enabled. */
"mposMode": {
"enabled": false,
"host": "127.0.0.1", //MySQL db host
"port": 3306, //MySQL db port
"user": "me", //MySQL db user
"password": "mypass", //MySQL db password
"database": "ltc", //MySQL db database name
//Found in src as the PROTOCOL_VERSION variable, for example: http://git.io/KjuCrw
"protocolVersion": 70002,
/* Checks for valid password in database when miners connect. */
"checkPassword": true,
/* Unregistered workers can automatically be registered (added to database) on stratum
worker authentication if this is true. */
"autoCreateWorker": false
}
}
@ -451,11 +496,11 @@ For more information on these configuration options see the [pool module documen
1. In `config.json` set the port and password for `blockNotifyListener`
2. In your daemon conf file set the `blocknotify` command to use:
```
node [path to scripts/blockNotify.js] [listener host]:[listener port] [listener password] [coin name in config] %s
node [path to cli.js] [coin name in config] [block hash symbol]
```
Example: inside `dogecoin.conf` add the line
```
blocknotify="node scripts/blockNotify.js localhost:8117 mySuperSecurePassword dogecoin %s"
blocknotify=node /home/nomp/scripts/cli.js blocknotify dogecoin %s
```
Alternatively, you can use a more efficient block notify script written in pure C. Build and usage instructions
@ -479,7 +524,8 @@ output from NOMP.
#### Upgrading NOMP
When updating NOMP to the latest code it's important to not only `git pull` the latest from this repo, but to also update the `node-stratum-pool` module and any config files that may have been changed.
When updating NOMP to the latest code it's important to not only `git pull` the latest from this repo, but to also update
the `node-stratum-pool` and `node-multi-hashing` modules, and any config files that may have been changed.
* Inside your NOMP directory (where the init.js script is) do `git pull` to get the latest NOMP code.
* Remove the dependencies by deleting the `node_modules` directory with `rm -r node_modules`.
* Run `npm update` to force updating/reinstalling of the dependencies.
@ -502,11 +548,15 @@ Credits
-------
* [Jerry Brady / mintyfresh68](https://github.com/bluecircle) - got coin-switching fully working and developed proxy-per-algo feature
* [Tony Dobbs](http://anthonydobbs.com) - designs for front-end and created the NOMP logo
* [LucasJones](//github.com/LucasJones) - got p2p block notify working and implemented additional hashing algos
* [vekexasia](//github.com/vekexasia) - co-developer & great tester
* [TheSeven](//github.com/TheSeven) - answering an absurd amount of my questions and being a very helpful gentleman
* [UdjinM6](//github.com/UdjinM6) - helped implement fee withdrawal in payment processing
* [Alex Petrov / sysmanalex](https://github.com/sysmanalex) - contributed the pure C block notify script
* Those that contributed to [node-stratum-pool](//github.com/zone117x/node-stratum-pool)
* [svirusxxx](//github.com/svirusxxx) - sponsored development of MPOS mode
* [icecube45](//github.com/icecube45) - helping out with the repo wiki
* [Fcases](//github.com/Fcases) - ordered me a pizza <3
* Those that contributed to [node-stratum-pool](//github.com/zone117x/node-stratum-pool#credits)
License

5
coins/21coin.json Normal file

@ -0,0 +1,5 @@
{
"name": "21coin",
"symbol": "21",
"algorithm": "sha256"
}

5
coins/arkenstone.json Normal file

@ -0,0 +1,5 @@
{
"name": "Arkenstone",
"symbol": "ARS",
"algorithm": "sha256"
}

5
coins/arkhash.json Normal file

@ -0,0 +1,5 @@
{
"name": "Arkhash",
"symbol": "ARK",
"algorithm": "sha256"
}

5
coins/asiccoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "ASICcoin",
"symbol": "ASC",
"algorithm": "sha256"
}

7
coins/battlecoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "Battlecoin",
"symbol": "BCX",
"algorithm": "sha256",
"peerMagic": "03e803e4",
"peerMagicTestnet": "cdf2c0ef"
}

5
coins/benjamins.json Normal file

@ -0,0 +1,5 @@
{
"name": "Benjamins",
"symbol": "BEN",
"algorithm": "sha256"
}

5
coins/betacoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Betacoin",
"symbol": "BET",
"algorithm": "sha256"
}

5
coins/bitraam.json Normal file

@ -0,0 +1,5 @@
{
"name": "BitRaam",
"symbol": "BRM",
"algorithm": "sha256"
}

7
coins/bitstar.json Normal file

@ -0,0 +1,7 @@
{
"name": "bitstar",
"symbol": "bits",
"algorithm": "scrypt",
"peerMagic": "cef1dbfa",
"peerMagicTestnet": "cdf1c0ef"
}

7
coins/bluecoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "bluecoin",
"symbol": "blu",
"algorithm": "scrypt",
"peerMagic": "fef5abaa",
"peerMagicTestnet": "eaceedcd"
}

5
coins/bytecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Bytecoin",
"symbol": "BTE",
"algorithm": "sha256"
}

5
coins/continuumcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Continuumcoin",
"symbol": "CTM",
"algorithm": "sha256"
}

7
coins/copperlark.json Normal file

@ -0,0 +1,7 @@
{
"name": "Copperlark",
"symbol": "CLR",
"algorithm": "keccak",
"normalHashing": true,
"diffShift": 32
}

8
coins/cryptometh.json Normal file

@ -0,0 +1,8 @@
{
"name": "Cryptometh",
"symbol": "METH",
"algorithm": "keccak",
"peerMagic": "2bf2ed4f",
"peerMagicTestnet": "b28cfda7"
}


@ -1,5 +1,6 @@
{
"name": "Darkcoin",
"symbol": "DRK",
"algorithm": "x11"
}
"algorithm": "x11",
"mposDiffMultiplier": 256
}

5
coins/defcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Defcoin",
"symbol": "DEF",
"algorithm": "scrypt"
}

5
coins/devcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Devcoin",
"symbol": "DVC",
"algorithm": "sha256"
}


@ -1,5 +1,7 @@
{
"name": "Dogecoin",
"symbol": "DOGE",
"algorithm": "scrypt"
}
"algorithm": "scrypt",
"peerMagic": "c0c0c0c0",
"peerMagicTestnet": "fcc1b7dc"
}

5
coins/einsteinium.json Normal file

@ -0,0 +1,5 @@
{
"name": "Einsteinium",
"symbol": "EMC2",
"algorithm": "scrypt"
}

5
coins/emark.json Normal file

@ -0,0 +1,5 @@
{
"name": "eMark",
"symbol": "DEM",
"algorithm": "sha256"
}


@ -1,5 +1,6 @@
{
"name": "Fastcoin",
"symbol": "FST",
"algorithm": "scrypt"
"algorithm": "scrypt",
"peerMagic": "fbc0b6db"
}

5
coins/fastcoinsha.json Normal file

@ -0,0 +1,5 @@
{
"name": "Fastcoinsha",
"symbol": "FSS",
"algorithm": "sha256"
}

5
coins/feathercoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Feathercoin",
"symbol": "FTC",
"algorithm": "scrypt"
}

5
coins/fedoracoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "FedoraCoin",
"symbol": "TiPS",
"algorithm": "scrypt"
}

5
coins/fireflycoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Fireflycoin",
"symbol": "FFC",
"algorithm": "sha256"
}

5
coins/freicoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Freicoin",
"symbol": "FRC",
"algorithm": "sha256"
}

5
coins/globalboost.json Normal file

@ -0,0 +1,5 @@
{
"name": "GlobalBoost",
"symbol": "BST",
"algorithm": "scrypt"
}

7
coins/globalcoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "Globalcoin",
"symbol": "GLC",
"algorithm": "scrypt",
"peerMagic": "fcd9b7dd",
"peerMagicTestnet": "fbc0b8db"
}


@ -1,5 +1,7 @@
{
"name": "GlobalDenomination",
"symbol": "GDN",
"algorithm": "x11"
"algorithm": "x11",
"peerMagic": "fec3b9de",
"peerMagicTestnet": "fec4bade"
}

7
coins/grandcoin.json Normal file

@ -0,0 +1,7 @@
{
"name": "Grandcoin",
"symbol": "GDC",
"algorithm": "scrypt",
"peerMagic": "fdc1a5db",
"txMessages": true
}

5
coins/groestlcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "GroestlCoin",
"symbol": "GRS",
"algorithm": "groestl"
}


@ -1,5 +1,6 @@
{
"name": "Hirocoin",
"symbol": "hic",
"algorithm": "x11"
}
"symbol": "HIRO",
"algorithm": "x11",
"mposDiffMultiplier": 256
}

5
coins/ixcoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Ixcoin",
"symbol": "IXC",
"algorithm": "sha256"
}

5
coins/jennycoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Jennycoin",
"symbol": "JNY",
"algorithm": "scrypt"
}

5
coins/joulecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Joulecoin",
"symbol": "XJO",
"algorithm": "sha256"
}

5
coins/klondikecoin.json Normal file

@ -0,0 +1,5 @@
{
"name": "Klondikecoin",
"symbol": "KDC",
"algorithm": "scrypt"
}

6
coins/kumacoin.json Normal file
View File

@ -0,0 +1,6 @@
{
"name": "Kumacoin",
"symbol": "KUMA",
"algorithm": "quark",
"mposDiffMultiplier": 256
}

View File

@ -1,5 +1,7 @@
{
"name": "Litecoin",
"symbol": "LTC",
"algorithm": "scrypt"
"algorithm": "scrypt",
"peerMagic": "fbc0b6db",
"peerMagicTestnet": "fcc1b7dc"
}

View File

@ -1,5 +1,7 @@
{
"name": "Lottocoin",
"symbol": "LOT",
"algorithm": "scrypt"
"algorithm": "scrypt",
"peerMagic": "a5fdb6c1",
"peerMagicTestnet": "fdc3b6f1"
}

View File

@ -1,5 +1,8 @@
{
"name": "Maxcoin",
"symbol": "MAX",
"algorithm": "keccak"
}
"algorithm": "keccak",
"peerMagic": "f9bebbd2",
"peerMagicTestNet": "0b11bb07"
}

5
coins/mazacoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Mazacoin",
"symbol": "MZC",
"algorithm": "sha256"
}

5
coins/mintcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Mintcoin",
"symbol": "MINT",
"algorithm": "scrypt"
}

5
coins/monacoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Monacoin",
"symbol": "MONA",
"algorithm": "scrypt"
}

6
coins/muniti.json Normal file
View File

@ -0,0 +1,6 @@
{
"name": "Muniti",
"symbol": "MUN",
"algorithm": "x11",
"mposDiffMultiplier": 256
}

5
coins/myriadcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Myriadcoin",
"symbol": "MYR",
"algorithm": "scrypt"
}

7
coins/octocoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "Octocoin",
"symbol": "888",
"algorithm": "scrypt",
"peerMagic": "fbc0b6db",
"peerMagicTestnet": "fcc1b7dc"
}

View File

@ -0,0 +1,6 @@
{
"name": "OpenSourcecoin",
"symbol": "OSC",
"algorithm": "sha256",
"txMessages" : true
}

7
coins/pawncoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "pawncoin",
"symbol": "pawn",
"algorithm": "scrypt",
"peerMagic": "fcc1b7dc",
"peerMagicTestnet": "c0c0c0c0"
}

7
coins/plncoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "plncoin",
"symbol": "plnc",
"algorithm": "scrypt",
"peerMagic": "fbc0b6db",
"peerMagicTestnet": "fcc1b7dc"
}

5
coins/potcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Potcoin",
"symbol": "POT",
"algorithm": "scrypt"
}

5
coins/procoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Procoin",
"symbol": "PCN",
"algorithm": "scrypt"
}

View File

@ -1,5 +1,6 @@
{
"name": "Quarkcoin",
"symbol": "QRK",
"algorithm": "quark"
}
"algorithm": "quark",
"mposDiffMultiplier": 256
}

5
coins/ronpaulcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "RonPaulCoin",
"symbol": "RPC",
"algorithm": "scrypt"
}

5
coins/rubycoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Rubycoin",
"symbol": "RUBY",
"algorithm": "scrypt"
}

7
coins/saffroncoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "saffroncoin",
"symbol": "SFR",
"algorithm": "scrypt",
"peerMagic": "cf0567ea",
"peerMagicTestnet": "01f555a4"
}

5
coins/sayacoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Sayacoin",
"symbol": "SYC",
"algorithm": "sha256"
}

5
coins/sha1coin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Sha1coin",
"symbol": "SHA",
"algorithm": "sha1coin"
}

5
coins/spartancoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Spartancoin",
"symbol": "SPN",
"algorithm": "scrypt"
}

6
coins/starcoin.json Normal file
View File

@ -0,0 +1,6 @@
{
"name": "Starcoin",
"symbol": "STR",
"algorithm": "scrypt",
"peerMagic": "e4e8effd"
}

5
coins/stashcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Stashcoin",
"symbol": "STA",
"algorithm": "sha256"
}

5
coins/stoopidcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Stoopidcoin",
"symbol": "STP",
"algorithm": "scrypt"
}

7
coins/suncoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "Suncoin",
"symbol": "SUN",
"algorithm": "scrypt",
"peerMagic":"fcd9b7dd",
"peerMagicTestnet":"fbc0b8db"
}

5
coins/takcoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Takcoin",
"symbol": "TAK",
"algorithm": "sha256"
}

5
coins/teacoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Teacoin",
"symbol": "TEA",
"algorithm": "sha256"
}

6
coins/tekcoin.json Normal file
View File

@ -0,0 +1,6 @@
{
"name": "Tekcoin",
"symbol": "TEK",
"algorithm": "sha256",
"txMessages": "true"
}

5
coins/terracoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Terracoin",
"symbol": "TRC",
"algorithm": "sha256"
}

5
coins/tigercoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Tigercoin",
"symbol": "TGC",
"algorithm": "sha256"
}

6
coins/ultimatecoin.json Normal file
View File

@ -0,0 +1,6 @@
{
"name": "Ultimatecoin",
"symbol": "ULT",
"algorithm": "scrypt",
"peerMagic": "f9f7c0e8"
}

5
coins/unobtanium.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Unobtanium",
"symbol": "UNO",
"algorithm": "sha256"
}

View File

@ -1,5 +1,7 @@
{
"name": "Vertcoin",
"symbol": "VTC",
"algorithm": "scrypt-n"
"algorithm": "scrypt-n",
"peerMagic": "fabfb5da",
"peerMagicTestnet": "76657274"
}

5
coins/wearesatoshi.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "WeAreSatoshi",
"symbol": "WAS",
"algorithm": "sha256"
}

5
coins/whitecoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Whitecoin",
"symbol": "WC",
"algorithm": "scrypt"
}

7
coins/zedcoin.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "zedcoin",
"symbol": "zed",
"algorithm": "scrypt",
"peerMagic": "c0dbf1fd",
"peerMagicTestnet": "fdc2b6f1"
}

5
coins/zetacoin.json Normal file
View File

@ -0,0 +1,5 @@
{
"name": "Zetacoin",
"symbol": "ZTC",
"algorithm": "sha256"
}

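Taken together, the coin definitions above follow a small schema: `name`, `symbol`, and `algorithm` are always present, while `peerMagic`/`peerMagicTestnet` (network message prefixes), `txMessages`, and `mposDiffMultiplier` appear only for coins that need them. A hypothetical coin file showing every field at once (the coin and values here are made up for illustration, not a real entry):

```json
{
    "name": "Examplecoin",
    "symbol": "EXC",
    "algorithm": "scrypt",
    "peerMagic": "fbc0b6db",
    "peerMagicTestnet": "fcc1b7dc",
    "txMessages": false,
    "mposDiffMultiplier": 256
}
```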
View File

@ -1,23 +1,43 @@
{
"logLevel": "debug",
"logColors": true,
"cliPort": 17117,
"clustering": {
"enabled": true,
"forks": "auto"
},
"defaultPoolConfigs": {
"blockRefreshInterval": 1000,
"jobRebroadcastTimeout": 55,
"connectionTimeout": 600,
"emitInvalidBlockHashes": false,
"validateWorkerUsername": true,
"tcpProxyProtocol": false,
"banning": {
"enabled": true,
"time": 600,
"invalidPercent": 50,
"checkThreshold": 500,
"purgeInterval": 300
},
"redis": {
"host": "127.0.0.1",
"port": 6379
}
},
"website": {
"enabled": true,
"host": "0.0.0.0",
"port": 80,
"stratumHost": "cryppit.com",
"stats": {
"updateInterval": 15,
"updateInterval": 60,
"historicalRetention": 43200,
"hashrateWindow": 300,
"redis": {
"host": "localhost",
"port": 6379
}
"hashrateWindow": 300
},
"adminCenter": {
"enabled": true,
@ -25,53 +45,69 @@
}
},
"blockNotifyListener": {
"enabled": false,
"port": 8117,
"password": "test"
"redis": {
"host": "127.0.0.1",
"port": 6379
},
"coinSwitchListener": {
"enabled": false,
"port": 8118,
"password": "test"
},
"proxy": {
"sha256": {
"switching": {
"switch1": {
"enabled": false,
"port": "3333",
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
"algorithm": "sha256",
"ports": {
"3333": {
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
}
}
},
"scrypt": {
"switch2": {
"enabled": false,
"port": "4444",
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
"algorithm": "scrypt",
"ports": {
"4444": {
"diff": 10,
"varDiff": {
"minDiff": 16,
"maxDiff": 512,
"targetTime": 15,
"retargetTime": 90,
"variancePercent": 30
}
}
}
},
"scrypt-n": {
"switch3": {
"enabled": false,
"port": "5555"
"algorithm": "x11",
"ports": {
"5555": {
"diff": 0.001,
"varDiff": {
"minDiff": 0.001,
"maxDiff": 1,
"targetTime": 15,
"retargetTime": 60,
"variancePercent": 30
}
}
}
}
},
"redisBlockNotifyListener": {
"profitSwitch": {
"enabled": false,
"redisPort": 6379,
"redisHost": "hostname",
"psubscribeKey": "newblocks:*"
"updateInterval": 600,
"depth": 0.90,
"usePoloniex": true,
"useCryptsy": true,
"useMintpal": true,
"useBittrex": true
}
}
}

311
init.js
View File

@ -4,14 +4,14 @@ var os = require('os');
var cluster = require('cluster');
var async = require('async');
var extend = require('extend');
var PoolLogger = require('./libs/logUtil.js');
var BlocknotifyListener = require('./libs/blocknotifyListener.js');
var CoinswitchListener = require('./libs/coinswitchListener.js');
var RedisBlocknotifyListener = require('./libs/redisblocknotifyListener.js');
var WorkerListener = require('./libs/workerListener.js');
var CliListener = require('./libs/cliListener.js');
var PoolWorker = require('./libs/poolWorker.js');
var PaymentProcessor = require('./libs/paymentProcessor.js');
var Website = require('./libs/website.js');
var ProfitSwitch = require('./libs/profitSwitch.js');
var algos = require('stratum-pool/lib/algoProperties.js');
@ -23,10 +23,12 @@ if (!fs.existsSync('config.json')){
}
var portalConfig = JSON.parse(JSON.minify(fs.readFileSync("config.json", {encoding: 'utf8'})));
var poolConfigs;
var logger = new PoolLogger({
logLevel: portalConfig.logLevel
logLevel: portalConfig.logLevel,
logColors: portalConfig.logColors
});
@ -49,6 +51,15 @@ try{
if (cluster.isMaster)
logger.warning('POSIX', 'Connection Limit', '(Safe to ignore) Must be ran as root to increase resource limits');
}
finally {
// Find out which user used sudo through the environment variable
var uid = parseInt(process.env.SUDO_UID);
// Set our server's uid to that user
if (uid) {
process.setuid(uid);
logger.debug('POSIX', 'Connection Limit', 'Raised to 100K concurrent connections, now running as non-root user: ' + process.getuid());
}
}
}
catch(e){
if (cluster.isMaster)
@ -56,9 +67,8 @@ catch(e){
}
if (cluster.isWorker){
switch(process.env.workerType){
case 'pool':
new PoolWorker(logger);
@ -69,6 +79,9 @@ if (cluster.isWorker){
case 'website':
new Website(logger);
break;
case 'profitSwitch':
new ProfitSwitch(logger);
break;
}
return;
@ -79,18 +92,82 @@ if (cluster.isWorker){
var buildPoolConfigs = function(){
var configs = {};
var configDir = 'pool_configs/';
var poolConfigFiles = [];
/* Get filenames of pool config json files that are enabled */
fs.readdirSync(configDir).forEach(function(file){
if (!fs.existsSync(configDir + file) || path.extname(configDir + file) !== '.json') return;
var poolOptions = JSON.parse(JSON.minify(fs.readFileSync(configDir + file, {encoding: 'utf8'})));
if (!poolOptions.enabled) return;
var coinFilePath = 'coins/' + poolOptions.coin;
poolOptions.fileName = file;
poolConfigFiles.push(poolOptions);
});
/* Ensure no pool uses any of the same ports as another pool */
for (var i = 0; i < poolConfigFiles.length; i++){
var ports = Object.keys(poolConfigFiles[i].ports);
for (var f = 0; f < poolConfigFiles.length; f++){
if (f === i) continue;
var portsF = Object.keys(poolConfigFiles[f].ports);
for (var g = 0; g < portsF.length; g++){
if (ports.indexOf(portsF[g]) !== -1){
logger.error('Master', poolConfigFiles[f].fileName, 'Has same configured port of ' + portsF[g] + ' as ' + poolConfigFiles[i].fileName);
process.exit(1);
return;
}
}
if (poolConfigFiles[f].coin === poolConfigFiles[i].coin){
logger.error('Master', poolConfigFiles[f].fileName, 'Pool has same configured coin file coins/' + poolConfigFiles[f].coin + ' as ' + poolConfigFiles[i].fileName + ' pool');
process.exit(1);
return;
}
}
}
poolConfigFiles.forEach(function(poolOptions){
poolOptions.coinFileName = poolOptions.coin;
var coinFilePath = 'coins/' + poolOptions.coinFileName;
if (!fs.existsSync(coinFilePath)){
logger.error('Master', poolOptions.coin, 'could not find file: ' + coinFilePath);
logger.error('Master', poolOptions.coinFileName, 'could not find file: ' + coinFilePath);
return;
}
var coinProfile = JSON.parse(JSON.minify(fs.readFileSync(coinFilePath, {encoding: 'utf8'})));
poolOptions.coin = coinProfile;
poolOptions.coin.name = poolOptions.coin.name.toLowerCase();
if (poolOptions.coin.name in configs){
logger.error('Master', poolOptions.fileName, 'coins/' + poolOptions.coinFileName
+ ' has same configured coin name ' + poolOptions.coin.name + ' as coins/'
+ configs[poolOptions.coin.name].coinFileName + ' used by pool config '
+ configs[poolOptions.coin.name].fileName);
process.exit(1);
return;
}
for (var option in portalConfig.defaultPoolConfigs){
if (!(option in poolOptions)){
var toCloneOption = portalConfig.defaultPoolConfigs[option];
var clonedOption = {};
if (toCloneOption.constructor === Object)
extend(true, clonedOption, toCloneOption);
else
clonedOption = toCloneOption;
poolOptions[option] = clonedOption;
}
}
configs[poolOptions.coin.name] = poolOptions;
if (!(coinProfile.algorithm in algos)){
@ -104,17 +181,10 @@ var buildPoolConfigs = function(){
var spawnPoolWorkers = function(portalConfig, poolConfigs){
var spawnPoolWorkers = function(){
Object.keys(poolConfigs).forEach(function(coin){
var p = poolConfigs[coin];
var internalEnabled = p.shareProcessing && p.shareProcessing.internal && p.shareProcessing.internal.enabled;
var mposEnabled = p.shareProcessing && p.shareProcessing.mpos && p.shareProcessing.mpos.enabled;
if (!internalEnabled && !mposEnabled){
logger.error('Master', coin, 'Share processing is not configured so a pool cannot be started for this coin.');
delete poolConfigs[coin];
}
if (!Array.isArray(p.daemons) || p.daemons.length < 1){
logger.error('Master', coin, 'No daemons configured so a pool cannot be started for this coin.');
@ -127,6 +197,7 @@ var spawnPoolWorkers = function(portalConfig, poolConfigs){
return;
}
var serializedConfigs = JSON.stringify(poolConfigs);
var numForks = (function(){
@ -139,6 +210,7 @@ var spawnPoolWorkers = function(portalConfig, poolConfigs){
return portalConfig.clustering.forks;
})();
var poolWorkers = {};
var createPoolWorker = function(forkId){
var worker = cluster.fork({
@ -147,11 +219,24 @@ var spawnPoolWorkers = function(portalConfig, poolConfigs){
pools: serializedConfigs,
portalConfig: JSON.stringify(portalConfig)
});
worker.forkId = forkId;
worker.type = 'pool';
poolWorkers[forkId] = worker;
worker.on('exit', function(code, signal){
logger.error('Master', 'PoolSpanwer', 'Fork ' + forkId + ' died, spawning replacement worker...');
logger.error('Master', 'PoolSpawner', 'Fork ' + forkId + ' died, spawning replacement worker...');
setTimeout(function(){
createPoolWorker(forkId);
}, 2000);
}).on('message', function(msg){
switch(msg.type){
case 'banIP':
Object.keys(cluster.workers).forEach(function(id) {
if (cluster.workers[id].type === 'pool'){
cluster.workers[id].send({type: 'banIP', ip: msg.ip});
}
});
break;
}
});
};
@ -168,85 +253,117 @@ var spawnPoolWorkers = function(portalConfig, poolConfigs){
};
var startWorkerListener = function(poolConfigs){
var workerListener = new WorkerListener(logger, poolConfigs);
workerListener.init();
};
var startCliListener = function(){
var cliPort = portalConfig.cliPort;
var startBlockListener = function(portalConfig){
//block notify options
//setup block notify here and use IPC to tell appropriate pools
var listener = new BlocknotifyListener(portalConfig.blockNotifyListener);
var listener = new CliListener(cliPort);
listener.on('log', function(text){
logger.debug('Master', 'Blocknotify', text);
});
listener.on('hash', function(message){
logger.debug('Master', 'CLI', text);
}).on('command', function(command, params, options, reply){
var ipcMessage = {type:'blocknotify', coin: message.coin, hash: message.hash};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
});
listener.start();
switch(command){
case 'blocknotify':
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send({type: 'blocknotify', coin: params[0], hash: params[1]});
});
reply('Pool workers notified');
break;
case 'coinswitch':
processCoinSwitchCommand(params, options, reply);
break;
case 'reloadpool':
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send({type: 'reloadpool', coin: params[0] });
});
reply('reloaded pool ' + params[0]);
break;
default:
reply('unrecognized command "' + command + '"');
break;
}
}).start();
};
//
// Receives authenticated events from coin switch listener and triggers proxy
// to switch to a new coin.
//
var startCoinswitchListener = function(portalConfig){
var listener = new CoinswitchListener(portalConfig.coinSwitchListener);
listener.on('log', function(text){
logger.debug('Master', 'Coinswitch', text);
});
listener.on('switchcoin', function(message){
var processCoinSwitchCommand = function(params, options, reply){
var ipcMessage = {type:'blocknotify', coin: message.coin, hash: message.hash};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
var logSystem = 'CLI';
var logComponent = 'coinswitch';
var replyError = function(msg){
reply(msg);
logger.error(logSystem, logComponent, msg);
};
if (!params[0]) {
replyError('Coin name required');
return;
}
if (!params[1] && !options.algorithm){
replyError('If switch key is not provided then algorithm options must be specified');
return;
}
else if (params[1] && !portalConfig.switching[params[1]]){
replyError('Switch key not recognized: ' + params[1]);
return;
}
else if (options.algorithm && !Object.keys(portalConfig.switching).filter(function(s){
return portalConfig.switching[s].algorithm === options.algorithm;
})[0]){
replyError('No switching options contain the algorithm ' + options.algorithm);
return;
}
var messageCoin = params[0].toLowerCase();
var newCoin = Object.keys(poolConfigs).filter(function(p){
return p.toLowerCase() === messageCoin;
})[0];
if (!newCoin){
replyError('Switch message to coin that is not recognized: ' + messageCoin);
return;
}
var switchNames = [];
if (params[1]) {
switchNames.push(params[1]);
}
else{
for (var name in portalConfig.switching){
if (portalConfig.switching[name].enabled && portalConfig.switching[name].algorithm === options.algorithm)
switchNames.push(name);
}
}
switchNames.forEach(function(name){
if (poolConfigs[newCoin].coin.algorithm !== portalConfig.switching[name].algorithm){
replyError('Cannot switch a '
+ portalConfig.switching[name].algorithm
+ ' algo pool to coin ' + newCoin + ' with ' + poolConfigs[newCoin].coin.algorithm + ' algo');
return;
}
Object.keys(cluster.workers).forEach(function (id) {
cluster.workers[id].send({type: 'coinswitch', coin: newCoin, switchName: name });
});
var ipcMessage = {
type:'switch',
coin: message.coin
};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
});
listener.start();
reply('Switch message sent to pool workers');
};
var startRedisBlockListener = function(portalConfig){
//block notify options
//setup block notify here and use IPC to tell appropriate pools
if (!portalConfig.redisBlockNotifyListener.enabled) return;
var listener = new RedisBlocknotifyListener(portalConfig.redisBlockNotifyListener);
listener.on('log', function(text){
logger.debug('Master', 'blocknotify', text);
}).on('hash', function (message) {
var ipcMessage = {type:'blocknotify', coin: message.coin, hash: message.hash};
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].send(ipcMessage);
});
});
listener.start();
};
var startPaymentProcessor = function(poolConfigs){
var startPaymentProcessor = function(){
var enabledForAny = false;
for (var pool in poolConfigs){
var p = poolConfigs[pool];
var enabled = p.enabled && p.shareProcessing && p.shareProcessing.internal && p.shareProcessing.internal.enabled;
var enabled = p.enabled && p.paymentProcessing && p.paymentProcessing.enabled;
if (enabled){
enabledForAny = true;
break;
@ -269,7 +386,7 @@ var startPaymentProcessor = function(poolConfigs){
};
var startWebsite = function(portalConfig, poolConfigs){
var startWebsite = function(){
if (!portalConfig.website.enabled) return;
@ -287,22 +404,40 @@ var startWebsite = function(portalConfig, poolConfigs){
};
var startProfitSwitch = function(){
if (!portalConfig.profitSwitch || !portalConfig.profitSwitch.enabled){
//logger.error('Master', 'Profit', 'Profit auto switching disabled');
return;
}
var worker = cluster.fork({
workerType: 'profitSwitch',
pools: JSON.stringify(poolConfigs),
portalConfig: JSON.stringify(portalConfig)
});
worker.on('exit', function(code, signal){
logger.error('Master', 'Profit', 'Profit switching process died, spawning replacement...');
setTimeout(function(){
startProfitSwitch();
}, 2000);
});
};
(function init(){
var poolConfigs = buildPoolConfigs();
poolConfigs = buildPoolConfigs();
spawnPoolWorkers(portalConfig, poolConfigs);
spawnPoolWorkers();
startPaymentProcessor(poolConfigs);
startPaymentProcessor();
startBlockListener(portalConfig);
startWebsite();
startCoinswitchListener(portalConfig);
startProfitSwitch();
startRedisBlockListener(portalConfig);
startWorkerListener(poolConfigs);
startWebsite(portalConfig, poolConfigs);
startCliListener();
})();

View File

@ -18,8 +18,7 @@ module.exports = function(logger, portalConfig, poolConfigs){
res.end(portalStats.statsString);
return;
case 'pool_stats':
res.writeHead(200, {'content-encoding': 'gzip'});
res.end(portalStats.statPoolHistoryBuffer);
res.end(JSON.stringify(portalStats.statPoolHistory));
return;
case 'live_stats':
res.writeHead(200, {

222
libs/apiBittrex.js Normal file
View File

@ -0,0 +1,222 @@
var request = require('request');
var nonce = require('nonce');
var crypto = require('crypto');
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'https://bittrex.com/api/v1/public',
PRIVATE_API_URL = 'https://bittrex.com/api/v1/market',
USER_AGENT = 'nomp/node-open-mining-portal';
// Constructor
function Bittrex(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Bittrex: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Bittrex.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '-' + currencyB;
}
// Prototype
Bittrex.prototype = {
constructor: Bittrex,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Bittrex.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/getmarketsummaries',
qs: null
};
return this._request(options, callback);
},
// getBuyOrderBook: function(currencyA, currencyB, callback){
// var options = {
// method: 'GET',
// url: PUBLIC_API_URL + '/orders/' + currencyB + '/' + currencyA + '/BUY',
// qs: null
// };
// return this._request(options, callback);
// },
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
market: joinCurrencies(currencyA, currencyB),
type: 'buy',
depth: '50'
};
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/getorderbook',
qs: parameters
};
return this._request(options, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Bittrex;
}();

115
libs/apiCoinWarz.js Normal file
View File

@ -0,0 +1,115 @@
var request = require('request');
var nonce = require('nonce');
var crypto = require('crypto');
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.0.1',
PUBLIC_API_URL = 'http://www.coinwarz.com/v1/api/profitability/?apikey=YOUR_API_KEY&algo=all',
USER_AGENT = 'nomp/node-open-mining-portal';
// Constructor
function CoinWarz(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'CoinWarz: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
CoinWarz.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
CoinWarz.prototype = {
constructor: CoinWarz,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = CoinWarz.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var parameters = {
method: 'marketdatav2'
};
return this._public(parameters, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
}
};

return CoinWarz;
}();

204
libs/apiCryptsy.js Normal file
View File

@ -0,0 +1,204 @@
var request = require('request');
var nonce = require('nonce');
var crypto = require('crypto');
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'http://pubapi.cryptsy.com/api.php',
PRIVATE_API_URL = 'https://api.cryptsy.com/api',
USER_AGENT = 'nomp/node-open-mining-portal';
// Constructor
function Cryptsy(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Cryptsy: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Cryptsy.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Cryptsy.prototype = {
constructor: Cryptsy,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Cryptsy.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var parameters = {
method: 'marketdatav2'
};
return this._public(parameters, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Cryptsy;
}();

216
libs/apiMintpal.js Normal file
View File

@ -0,0 +1,216 @@
var request = require('request');
var nonce = require('nonce');
var crypto = require('crypto');
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'https://api.mintpal.com/v2/market',
PRIVATE_API_URL = 'https://api.mintpal.com/v2/market',
USER_AGENT = 'nomp/node-open-mining-portal';
// Constructor
function Mintpal(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Mintpal: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Mintpal.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Mintpal.prototype = {
constructor: Mintpal,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Mintpal.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/summary',
qs: null
};
return this._request(options, callback);
},
getBuyOrderBook: function(currencyA, currencyB, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL + '/orders/' + currencyB + '/' + currencyA + '/BUY',
qs: null
};
return this._request(options, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Mintpal;
}();

libs/apiPoloniex.js (new file, 212 lines)

@ -0,0 +1,212 @@
var request = require('request');
var nonce = require('nonce')();
var crypto = require('crypto');
module.exports = function() {
'use strict';
// Module dependencies
// Constants
var version = '0.1.0',
PUBLIC_API_URL = 'https://poloniex.com/public',
PRIVATE_API_URL = 'https://poloniex.com/tradingApi',
USER_AGENT = 'npm-crypto-apis/' + version;
// Constructor
function Poloniex(key, secret){
// Generate headers signed by this user's key and secret.
// The secret is encapsulated and never exposed
this._getPrivateHeaders = function(parameters){
var paramString, signature;
if (!key || !secret){
throw 'Poloniex: Error. API key and secret required';
}
// Sort parameters alphabetically and convert to `arg1=foo&arg2=bar`
paramString = Object.keys(parameters).sort().map(function(param){
return encodeURIComponent(param) + '=' + encodeURIComponent(parameters[param]);
}).join('&');
signature = crypto.createHmac('sha512', secret).update(paramString).digest('hex');
return {
Key: key,
Sign: signature
};
};
}
// If a site uses non-trusted SSL certificates, set this value to false
Poloniex.STRICT_SSL = true;
// Helper methods
function joinCurrencies(currencyA, currencyB){
return currencyA + '_' + currencyB;
}
// Prototype
Poloniex.prototype = {
constructor: Poloniex,
// Make an API request
_request: function(options, callback){
if (!('headers' in options)){
options.headers = {};
}
options.headers['User-Agent'] = USER_AGENT;
options.json = true;
options.strictSSL = Poloniex.STRICT_SSL;
request(options, function(err, response, body) {
callback(err, body);
});
return this;
},
// Make a public API request
_public: function(parameters, callback){
var options = {
method: 'GET',
url: PUBLIC_API_URL,
qs: parameters
};
return this._request(options, callback);
},
// Make a private API request
_private: function(parameters, callback){
var options;
parameters.nonce = nonce();
options = {
method: 'POST',
url: PRIVATE_API_URL,
form: parameters,
headers: this._getPrivateHeaders(parameters)
};
return this._request(options, callback);
},
/////
// PUBLIC METHODS
getTicker: function(callback){
var parameters = {
command: 'returnTicker'
};
return this._public(parameters, callback);
},
get24hVolume: function(callback){
var parameters = {
command: 'return24hVolume'
};
return this._public(parameters, callback);
},
getOrderBook: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOrderBook',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
getTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._public(parameters, callback);
},
/////
// PRIVATE METHODS
myBalances: function(callback){
var parameters = {
command: 'returnBalances'
};
return this._private(parameters, callback);
},
myOpenOrders: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnOpenOrders',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
myTradeHistory: function(currencyA, currencyB, callback){
var parameters = {
command: 'returnTradeHistory',
currencyPair: joinCurrencies(currencyA, currencyB)
};
return this._private(parameters, callback);
},
buy: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'buy',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
sell: function(currencyA, currencyB, rate, amount, callback){
var parameters = {
command: 'sell',
currencyPair: joinCurrencies(currencyA, currencyB),
rate: rate,
amount: amount
};
return this._private(parameters, callback);
},
cancelOrder: function(currencyA, currencyB, orderNumber, callback){
var parameters = {
command: 'cancelOrder',
currencyPair: joinCurrencies(currencyA, currencyB),
orderNumber: orderNumber
};
return this._private(parameters, callback);
},
withdraw: function(currency, amount, address, callback){
var parameters = {
command: 'withdraw',
currency: currency,
amount: amount,
address: address
};
return this._private(parameters, callback);
}
};
return Poloniex;
}();
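Every private call above attaches `parameters.nonce = nonce()` so the exchange can reject replayed requests. The npm `nonce` package produces an ever-increasing number; a rough stand-in (not the package's actual implementation) behaves like this:

```javascript
// Clock-based, strictly increasing nonce - a stand-in for require('nonce')().
var last = 0;
function nonce() {
    var n = Date.now();
    if (n <= last) n = last + 1;   // two calls in the same millisecond still increase
    last = n;
    return n;
}

var a = nonce();
var b = nonce();
console.log(b > a); // true
```

The only property the exchange relies on is monotonicity: each signed request must carry a nonce greater than the previous one for that API key.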


@ -1,69 +0,0 @@
var events = require('events');
var net = require('net');
var listener = module.exports = function listener(options){
var _this = this;
var emitLog = function(text){
_this.emit('log', text);
};
this.start = function(){
if (!options || !options.enabled){
emitLog('Blocknotify listener disabled');
return;
}
var blockNotifyServer = net.createServer(function(c) {
emitLog('Block listener has incoming connection');
var data = '';
try {
c.on('data', function (d) {
emitLog('Block listener received blocknotify data');
data += d;
if (data.slice(-1) === '\n') {
c.end();
}
});
c.on('end', function () {
emitLog('Block listener connection ended');
var message;
try{
message = JSON.parse(data);
}
catch(e){
emitLog('Block listener failed to parse message ' + data);
return;
}
if (message.password === options.password) {
_this.emit('hash', message);
}
else
emitLog('Block listener received notification with incorrect password');
});
}
catch(e){
emitLog('Block listener had an error: ' + e);
}
});
blockNotifyServer.listen(options.port, function() {
emitLog('Block notify listener server started on port ' + options.port)
});
emitLog("Block listener is enabled, starting server on port " + options.port);
}
};
listener.prototype.__proto__ = events.EventEmitter.prototype;

libs/cliListener.js (new file, 45 lines)

@ -0,0 +1,45 @@
var events = require('events');
var net = require('net');
var listener = module.exports = function listener(port){
var _this = this;
var emitLog = function(text){
_this.emit('log', text);
};
this.start = function(){
net.createServer(function(c) {
var data = '';
try {
c.on('data', function (d) {
data += d;
if (data.slice(-1) === '\n') {
var message = JSON.parse(data);
_this.emit('command', message.command, message.params, message.options, function(message){
c.end(message);
});
}
});
c.on('end', function () {
});
c.on('error', function () {
});
}
catch(e){
emitLog('CLI listener failed to parse message ' + data);
}
}).listen(port, '127.0.0.1', function() {
emitLog('CLI listening on port ' + port)
});
}
};
listener.prototype.__proto__ = events.EventEmitter.prototype;
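The listener above treats a trailing newline as the end of a message and then JSON-parses the whole buffer, so a client only needs to write one `{command, params, options}` object followed by `'\n'`. A sketch of both sides of that framing, without a socket (the `'coins'` command name is just an example):

```javascript
// Client side: frame a command the way the CLI listener expects it.
function frameCommand(command, params, options) {
    return JSON.stringify({ command: command, params: params, options: options }) + '\n';
}

// Server side: the same completeness check the listener performs before parsing.
function tryParse(data) {
    if (data.slice(-1) !== '\n') return null;   // still waiting for more data
    return JSON.parse(data);
}

var message = tryParse(frameCommand('coins', [], {}));
console.log(message.command); // 'coins'
```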


@ -1,56 +0,0 @@
var events = require('events');
var net = require('net');
var listener = module.exports = function listener(options){
var _this = this;
var emitLog = function(text){
_this.emit('log', text);
};
this.start = function(){
if (!options || !options.enabled){
emitLog('Coinswitch listener disabled');
return;
}
var coinswitchServer = net.createServer(function(c) {
emitLog('Coinswitch listener has incoming connection');
var data = '';
try {
c.on('data', function (d) {
emitLog('Coinswitch listener received switch request');
data += d;
if (data.slice(-1) === '\n') {
c.end();
}
});
c.on('end', function () {
var message = JSON.parse(data);
if (message.password === options.password) {
_this.emit('switchcoin', message);
}
else
emitLog('Coinswitch listener received notification with incorrect password');
});
}
catch(e){
emitLog('Coinswitch listener failed to parse message ' + data);
}
});
coinswitchServer.listen(options.port, function() {
emitLog('Coinswitch notify listener server started on port ' + options.port)
});
emitLog("Coinswitch listener is enabled, starting server on port " + options.port);
}
};
listener.prototype.__proto__ = events.EventEmitter.prototype;


@ -30,6 +30,7 @@ var PoolLogger = function (configuration) {
var logLevelInt = severityValues[configuration.logLevel];
var logColors = configuration.logColors;
@ -45,16 +46,28 @@ var PoolLogger = function (configuration) {
}
var entryDesc = dateFormat(new Date(), 'yyyy-mm-dd HH:MM:ss') + ' [' + system + ']\t';
entryDesc = severityToColor(severity, entryDesc);
if (logColors) {
entryDesc = severityToColor(severity, entryDesc);
var logString =
entryDesc +
('[' + component + '] ').italic;
var logString =
entryDesc +
('[' + component + '] ').italic;
if (subcat)
logString += ('(' + subcat + ') ').bold.grey
if (subcat)
logString += ('(' + subcat + ') ').bold.grey;
logString += text.grey;
logString += text.grey;
}
else {
var logString =
entryDesc +
'[' + component + '] ';
if (subcat)
logString += '(' + subcat + ') ';
logString += text;
}
console.log(logString);


@ -2,60 +2,77 @@ var mysql = require('mysql');
var cluster = require('cluster');
module.exports = function(logger, poolConfig){
var mposConfig = poolConfig.shareProcessing.mpos;
var mposConfig = poolConfig.mposMode;
var coin = poolConfig.coin.name;
var connection;
var connection = mysql.createPool({
host: mposConfig.host,
port: mposConfig.port,
user: mposConfig.user,
password: mposConfig.password,
database: mposConfig.database
});
var logIdentify = 'MySQL';
var logComponent = coin;
function connect(){
connection = mysql.createConnection({
host: mposConfig.host,
port: mposConfig.port,
user: mposConfig.user,
password: mposConfig.password,
database: mposConfig.database
});
connection.connect(function(err){
if (err)
logger.error(logIdentify, logComponent, 'Could not connect to mysql database: ' + JSON.stringify(err))
else{
logger.debug(logIdentify, logComponent, 'Successful connection to MySQL database');
}
});
connection.on('error', function(err){
if(err.code === 'PROTOCOL_CONNECTION_LOST') {
logger.warning(logIdentify, logComponent, 'Lost connection to MySQL database, attempting reconnection...');
connect();
}
else{
logger.error(logIdentify, logComponent, 'Database error: ' + JSON.stringify(err))
}
});
}
connect();
this.handleAuth = function(workerName, password, authCallback){
if (poolConfig.validateWorkerUsername !== true && mposConfig.autoCreateWorker !== true){
authCallback(true);
return;
}
connection.query(
'SELECT password FROM pool_worker WHERE username = LOWER(?)',
[workerName],
[workerName.toLowerCase()],
function(err, result){
if (err){
logger.error(logIdentify, logComponent, 'Database error when authenticating worker: ' +
JSON.stringify(err));
authCallback(false);
}
else if (!result[0])
else if (!result[0]){
if(mposConfig.autoCreateWorker){
var account = workerName.split('.')[0];
connection.query(
'SELECT id,username FROM accounts WHERE username = LOWER(?)',
[account.toLowerCase()],
function(err, result){
if (err){
logger.error(logIdentify, logComponent, 'Database error when authenticating account: ' +
JSON.stringify(err));
authCallback(false);
}else if(!result[0]){
authCallback(false);
}else{
connection.query(
"INSERT INTO `pool_worker` (`account_id`, `username`, `password`) VALUES (?, ?, ?);",
[result[0].id,workerName.toLowerCase(),password],
function(err, result){
if (err){
logger.error(logIdentify, logComponent, 'Database error when insert worker: ' +
JSON.stringify(err));
authCallback(false);
}else {
authCallback(true);
}
})
}
}
);
}
else{
authCallback(false);
}
}
else if (mposConfig.checkPassword && result[0].password !== password)
authCallback(false);
else if (mposConfig.stratumAuth === 'worker')
authCallback(true);
else if (result[0].password === password)
authCallback(true)
else
authCallback(false);
authCallback(true);
}
);
@ -66,9 +83,9 @@ module.exports = function(logger, poolConfig){
var dbData = [
shareData.ip,
shareData.worker,
isValidShare ? 'Y' : 'N',
isValidShare ? 'Y' : 'N',
isValidBlock ? 'Y' : 'N',
shareData.difficulty,
shareData.difficulty * (poolConfig.coin.mposDiffMultiplier || 1),
typeof(shareData.error) === 'undefined' ? null : shareData.error,
shareData.blockHash ? shareData.blockHash : (shareData.blockHashInvalid ? shareData.blockHashInvalid : '')
];
@ -102,4 +119,4 @@ module.exports = function(logger, poolConfig){
};
};
};


@ -1,8 +1,10 @@
var fs = require('fs');
var redis = require('redis');
var async = require('async');
var Stratum = require('stratum-pool');
var util = require('stratum-pool/lib/util.js');
module.exports = function(logger){
@ -13,9 +15,8 @@ module.exports = function(logger){
Object.keys(poolConfigs).forEach(function(coin) {
var poolOptions = poolConfigs[coin];
if (poolOptions.shareProcessing &&
poolOptions.shareProcessing.internal &&
poolOptions.shareProcessing.internal.enabled)
if (poolOptions.paymentProcessing &&
poolOptions.paymentProcessing.enabled)
enabledPools.push(coin);
});
@ -27,19 +28,17 @@ module.exports = function(logger){
coins.forEach(function(coin){
var poolOptions = poolConfigs[coin];
var processingConfig = poolOptions.shareProcessing.internal;
var processingConfig = poolOptions.paymentProcessing;
var logSystem = 'Payments';
var logComponent = coin;
logger.debug(logSystem, logComponent, 'Payment processing setup to run every '
+ processingConfig.paymentInterval + ' second(s) with daemon ('
+ processingConfig.daemon.user + '@' + processingConfig.daemon.host + ':' + processingConfig.daemon.port
+ ') and redis (' + processingConfig.redis.host + ':' + processingConfig.redis.port + ')');
+ ') and redis (' + poolOptions.redis.host + ':' + poolOptions.redis.port + ')');
});
});
};
@ -47,68 +46,66 @@ function SetupForPool(logger, poolOptions, setupFinished){
var coin = poolOptions.coin.name;
var processingConfig = poolOptions.shareProcessing.internal;
var processingConfig = poolOptions.paymentProcessing;
var logSystem = 'Payments';
var logComponent = coin;
var processingPayments = true;
var daemon = new Stratum.daemon.interface([processingConfig.daemon], function(severity, message){
logger[severity](logSystem, logComponent, message);
});
var redisClient = redis.createClient(poolOptions.redis.port, poolOptions.redis.host);
var daemon;
var redisClient;
var magnitude;
var minPaymentSatoshis;
var coinPrecision;
var paymentInterval;
async.parallel([
function(callback){
daemon = new Stratum.daemon.interface([processingConfig.daemon]);
daemon.once('online', function(){
daemon.cmd('validateaddress', [poolOptions.address], function(result){
if (!result[0].response || !result[0].response.ismine){
logger.error(logSystem, logComponent,
daemon.cmd('validateaddress', [poolOptions.address], function(result) {
if (result.error){
logger.error(logSystem, logComponent, 'Error with payment processing daemon ' + JSON.stringify(result.error));
callback(true);
}
else if (!result.response || !result.response.ismine) {
logger.error(logSystem, logComponent,
'Daemon does not own pool address - payment processing can not be done with this daemon, '
+ JSON.stringify(result[0].response));
return;
}
+ JSON.stringify(result.response));
callback(true);
}
else{
callback()
});
}).once('connectionFailed', function(error){
logger.error(logSystem, logComponent, 'Failed to connect to daemon for payment processing: config ' +
JSON.stringify(processingConfig.daemon) + ', error: ' +
JSON.stringify(error));
callback('Error connecting to daemon');
}).on('error', function(error){
logger.error(logSystem, logComponent, 'Daemon error ' + JSON.stringify(error));
}).init();
}
}, true);
},
function(callback){
redisClient = redis.createClient(processingConfig.redis.port, processingConfig.redis.host);
redisClient.on('ready', function(){
if (callback) {
callback();
callback = null;
daemon.cmd('getbalance', [], function(result){
if (result.error){
callback(true);
return;
}
logger.debug(logSystem, logComponent, 'Connected to redis at '
+ processingConfig.redis.host + ':' + processingConfig.redis.port + ' for payment processing');
}).on('end', function(){
logger.error(logSystem, logComponent, 'Connection to redis database has been ended');
}).once('error', function(){
if (callback) {
logger.error(logSystem, logComponent, 'Failed to connect to redis at '
+ processingConfig.redis.host + ':' + processingConfig.redis.port + ' for payment processing');
callback('Error connecting to redis');
callback = null;
try {
var d = result.data.split('result":')[1].split(',')[0].split('.')[1];
magnitude = parseInt('10' + new Array(d.length).join('0'));
minPaymentSatoshis = parseInt(processingConfig.minimumPayment * magnitude);
coinPrecision = magnitude.toString().length - 1;
callback();
}
catch(e){
logger.error(logSystem, logComponent, 'Error detecting number of satoshis in a coin, cannot do payment processing. Tried parsing: ' + result.data);
callback(true);
}
});
}, true, true);
}
], function(err){
if (err){
setupFinished(false);
return;
}
setInterval(function(){
paymentInterval = setInterval(function(){
try {
processPayments();
} catch(e){
@ -120,178 +117,166 @@ function SetupForPool(logger, poolOptions, setupFinished){
});
/* Call redis to check if previous sendmany and/or redis cleanout commands completed successfully.
If sendmany worked fine but redis commands failed you HAVE TO run redis commands again
(manually) to prevent double payments. If sendmany failed too you can safely delete
coin + '_finalRedisCommands' string from redis to let pool calculate payments again. */
function checkPreviousPaymentsStatus(callback) {
redisClient.get(coin + '_finalRedisCommands', function(error, reply) {
if (error){
callback('Could not get finalRedisCommands - ' + JSON.stringify(error));
return;
}
if (reply) {
callback('Payments stopped because of the critical error - failed commands saved in '
+ coin + '_finalRedisCommands redis set:\n' + reply);
return;
} else {
/* There was no error in previous sendmany and/or redis cleanout commands
so we can safely continue */
processingPayments = false;
callback();
}
});
}
/* Number.toFixed gives us the decimal places we want, but as a string. parseFloat turns it back into a number;
we don't care about trailing zeros in this case. */
var toPrecision = function(value, precision){
return parseFloat(value.toFixed(precision));
var satoshisToCoins = function(satoshis){
return parseFloat((satoshis / magnitude).toFixed(coinPrecision));
};
var coinsToSatoshies = function(coins){
return coins * magnitude;
};
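// Worked example of the two conversions above, assuming an 8-decimal coin
// (magnitude = 100000000, as detected from the daemon's getbalance at startup):

```javascript
// Same conversion helpers as above, with the magnitude fixed for illustration.
var magnitude = 100000000;
var coinPrecision = magnitude.toString().length - 1;   // 8

function satoshisToCoins(satoshis) {
    return parseFloat((satoshis / magnitude).toFixed(coinPrecision));
}
function coinsToSatoshies(coins) {
    return coins * magnitude;
}

console.log(satoshisToCoins(12345678));  // 0.12345678
console.log(coinsToSatoshies(1.5));      // 150000000
```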
/* Deal with numbers in smallest possible units (satoshis) as much as possible. This greatly helps with accuracy
when rounding and whatnot. When we are storing numbers for only humans to see, store in whole coin units. */
var processPayments = function(){
var startPaymentProcess = Date.now();
async.waterfall([
var timeSpentRPC = 0;
var timeSpentRedis = 0;
function(callback) {
if (processingPayments) {
checkPreviousPaymentsStatus(function(error){
if (error) {
logger.error(logSystem, logComponent, error);
callback('Check finished - previous payments processing error');
return;
}
callback();
});
return;
}
callback();
},
var startTimeRedis;
var startTimeRPC;
var startRedisTimer = function(){ startTimeRedis = Date.now() };
var endRedisTimer = function(){ timeSpentRedis += Date.now() - startTimeRedis };
var startRPCTimer = function(){ startTimeRPC = Date.now(); };
var endRPCTimer = function(){ timeSpentRPC += Date.now() - startTimeRPC };
async.waterfall([
/* Call redis to get an array of rounds - which are coinbase transactions and block heights from submitted
blocks. */
function(callback){
redisClient.smembers(coin + '_blocksPending', function(error, results){
startRedisTimer();
redisClient.multi([
['hgetall', coin + ':balances'],
['smembers', coin + ':blocksPending']
]).exec(function(error, results){
endRedisTimer();
if (error){
logger.error(logSystem, logComponent, 'Could not get blocks from redis ' + JSON.stringify(error));
callback('Check finished - redis error for getting blocks');
return;
}
if (results.length === 0){
callback('Check finished - no pending blocks in redis');
callback(true);
return;
}
var rounds = results.map(function(r){
var workers = {};
for (var w in results[0]){
workers[w] = {balance: coinsToSatoshies(parseFloat(results[0][w]))};
}
var rounds = results[1].map(function(r){
var details = r.split(':');
return {
category: details[0].category,
blockHash: details[0],
txHash: details[1],
height: details[2],
reward: details[3],
serialized: r
};
});
callback(null, rounds);
callback(null, workers, rounds);
});
},
/* Does a batch rpc call to daemon with all the transaction hashes to see if they are confirmed yet.
It also adds the block reward amount to the round object - which the daemon also gives us. */
function(rounds, callback){
function(workers, rounds, callback){
var batchRPCcommand = rounds.map(function(r){
return ['gettransaction', [r.txHash]];
});
batchRPCcommand.push(['getaccount', [poolOptions.address]]);
startRPCTimer();
daemon.batchCmd(batchRPCcommand, function(error, txDetails){
endRPCTimer();
if (error || !txDetails){
callback('Check finished - daemon rpc error with batch gettransactions ' +
JSON.stringify(error));
logger.error(logSystem, logComponent, 'Check finished - daemon rpc error with batch gettransactions '
+ JSON.stringify(error));
callback(true);
return;
}
var addressAccount;
txDetails.forEach(function(tx, i){
if (i === txDetails.length - 1){
addressAccount = tx.result;
return;
}
var round = rounds[i];
if (tx.error && tx.error.code === -5 || round.blockHash !== tx.result.blockhash){
/* Block was dropped from coin daemon even after it happily accepted it earlier. */
//If we find another block at the same height then this block was drop-kicked orphaned
var dropKicked = rounds.filter(function(r){
return r.height === round.height && r.blockHash !== round.blockHash && r.category !== 'dropkicked';
}).length > 0;
if (dropKicked){
logger.warning(logSystem, logComponent,
'A block was drop-kicked orphaned'
+ ' - we found a better block at the same height, blockHash '
+ round.blockHash + " round " + round.height);
round.category = 'dropkicked';
}
else{
/* We have no other blocks that match this height so convert to orphan in order for
shares from the round to be rewarded. */
round.category = 'orphan';
}
if (tx.error && tx.error.code === -5){
logger.warning(logSystem, logComponent, 'Daemon reports invalid transaction: ' + round.txHash);
round.category = 'kicked';
return;
}
else if (!tx.result.details || (tx.result.details && tx.result.details.length === 0)){
logger.warning(logSystem, logComponent, 'Daemon reports no details for transaction: ' + round.txHash);
round.category = 'kicked';
return;
}
else if (tx.error || !tx.result){
logger.error(logSystem, logComponent,
'Error with requesting transaction from block daemon: ' + JSON.stringify(tx));
logger.error(logSystem, logComponent, 'Odd error with gettransaction ' + round.txHash + ' '
+ JSON.stringify(tx));
return;
}
else{
round.category = tx.result.details[0].category;
if (round.category === 'generate')
round.amount = tx.result.amount;
var generationTx = tx.result.details.filter(function(tx){
return tx.address === poolOptions.address;
})[0];
if (!generationTx && tx.result.details.length === 1){
generationTx = tx.result.details[0];
}
if (!generationTx){
logger.error(logSystem, logComponent, 'Missing output details to pool address for transaction '
+ round.txHash);
return;
}
round.category = generationTx.category;
if (round.category === 'generate') {
round.reward = generationTx.amount || generationTx.value;
}
});
var canDeleteShares = function(r){
for (var i = 0; i < rounds.length; i++){
var compareR = rounds[i];
if ((compareR.height === r.height)
&& (compareR.category !== 'kicked')
&& (compareR.category !== 'orphan')
&& (compareR.serialized !== r.serialized)){
return false;
}
}
return true;
};
var magnitude;
//Filter out all rounds that are immature (not confirmed or orphaned yet)
rounds = rounds.filter(function(r){
switch (r.category) {
case 'generate':
/* Here we calculate the smallest unit in this coin's currency; the 'satoshi'.
The rpc.getblocktemplate.amount tells us how much we get in satoshis, while the
rpc.gettransaction.amount tells us how much we get in whole coin units. Therefore,
we simply divide the two to get the magnitude. I don't know math, there is probably
a better term than 'magnitude'. Sue me or do a pull request to fix it. */
var roundMagnitude = r.reward / r.amount;
if (!magnitude) {
magnitude = roundMagnitude;
if (roundMagnitude % 10 !== 0)
logger.error(logSystem, logComponent,
'Satoshis in coin is not divisible by 10 which is very odd');
}
else if (magnitude != roundMagnitude) {
/* Magnitude for a coin should ALWAYS be the same. For BTC and most coins there are
100,000,000 satoshis in one coin unit. */
logger.error(logSystem, logComponent,
'Magnitude in a round was different than in another round. HUGE PROBLEM.');
}
return true;
case 'dropkicked':
case 'orphan':
case 'kicked':
r.canDeleteShares = canDeleteShares(r);
case 'generate':
return true;
default:
return false;
@ -299,35 +284,30 @@ function SetupForPool(logger, poolOptions, setupFinished){
});
if (rounds.length === 0){
callback('Check finished - no confirmed or orphaned blocks found');
}
else{
callback(null, rounds, magnitude);
}
callback(null, workers, rounds, addressAccount);
});
},
/* Does a batch redis call to get shares contributed to each round. Then calculates the reward
amount owned to each miner for each round. */
function(rounds, magnitude, callback){
function(workers, rounds, addressAccount, callback){
var shareLookups = rounds.map(function(r){
return ['hgetall', coin + '_shares:round' + r.height]
return ['hgetall', coin + ':shares:round' + r.height]
});
startRedisTimer();
redisClient.multi(shareLookups).exec(function(error, allWorkerShares){
endRedisTimer();
if (error){
callback('Check finished - redis error with multi get rounds share')
callback('Check finished - redis error with multi get rounds share');
return;
}
var orphanMergeCommands = [];
var workerRewards = {};
rounds.forEach(function(round, i){
var workerShares = allWorkerShares[i];
@ -339,318 +319,212 @@ function SetupForPool(logger, poolOptions, setupFinished){
}
switch (round.category){
case 'kicked':
case 'orphan':
/* Each block that gets orphaned, all the shares go into the current round so that
miners still get a reward for their work. This seems unfair to those that just
started mining during this current round, but over time it balances out and rewards
loyal miners. */
Object.keys(workerShares).forEach(function(worker){
orphanMergeCommands.push(['hincrby', coin + '_shares:roundCurrent',
worker, workerShares[worker]]);
});
round.workerShares = workerShares;
break;
case 'generate':
/* We found a confirmed block! Now get the reward for it and calculate how much
we owe each miner based on the shares they submitted during that block round. */
var reward = round.reward * (1 - processingConfig.feePercent);
var reward = parseInt(round.reward * magnitude);
var totalShares = Object.keys(workerShares).reduce(function(p, c){
return p + parseInt(workerShares[c])
return p + parseFloat(workerShares[c])
}, 0);
for (var worker in workerShares){
var percent = parseInt(workerShares[worker]) / totalShares;
for (var workerAddress in workerShares){
var percent = parseFloat(workerShares[workerAddress]) / totalShares;
var workerRewardTotal = Math.floor(reward * percent);
if (!(worker in workerRewards)) workerRewards[worker] = 0;
workerRewards[worker] += workerRewardTotal;
var worker = workers[workerAddress] = (workers[workerAddress] || {});
worker.reward = (worker.reward || 0) + workerRewardTotal;
}
break;
}
});
callback(null, rounds, magnitude, workerRewards, orphanMergeCommands);
callback(null, workers, rounds, addressAccount);
});
},
/* Does a batch call to redis to get worker existing balances from coin_balances*/
function(rounds, magnitude, workerRewards, orphanMergeCommands, callback){
var workers = Object.keys(workerRewards);
redisClient.hmget([coin + '_balances'].concat(workers), function(error, results){
if (error && workers.length !== 0){
callback('Check finished - redis error with multi get balances ' + JSON.stringify(error));
return;
}
var workerBalances = {};
for (var i = 0; i < workers.length; i++){
workerBalances[workers[i]] = (parseInt(results[i]) || 0);
}
callback(null, rounds, magnitude, workerRewards, orphanMergeCommands, workerBalances);
});
},
/* Calculate if any payments are ready to be sent and trigger sending them.
Get the balance difference for each address and pass it along as an object of latest balances such as
{worker1: balance1, worker2: balance2}.
When sending the balance, the difference should be -1 * (the amount they had in the db);
when not sending the balance, the difference should be +(the amount they earned this round).
*/
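/* A worked example with hypothetical numbers (assume a 0.5 coin payout threshold):
   worker A has 0.2 coins banked and earned 0.4 this round -> 0.6 is sent,
   so A's balance difference is -0.2 (the banked amount is consumed);
   worker B has 0.1 banked and earned 0.2 -> below threshold, nothing is sent,
   so B's balance difference is +0.2 (this round's earnings are banked). */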
function(rounds, magnitude, workerRewards, orphanMergeCommands, workerBalances, callback){
function(workers, rounds, addressAccount, callback) {
//Number of satoshis in a single coin unit - this can differ between coins so we calculate it :)
daemon.cmd('getbalance', [''], function(results){
var totalBalance = results[0].response * magnitude;
var toBePaid = 0;
var workerPayments = {};
var balanceUpdateCommands = [];
var workerPayoutsCommand = [];
/* Here we add up all workers' previous unpaid balances plus their current rewards as we are
about to check if they reach the payout threshold. */
for (var worker in workerRewards){
workerPayments[worker] = ((workerPayments[worker] || 0) + workerRewards[worker]);
}
for (var worker in workerBalances){
workerPayments[worker] = ((workerPayments[worker] || 0) + workerBalances[worker]);
}
/* Here we check if any of the workers reached their payout threshold, or delete them from the
pending payment ledger (the workerPayments object). */
if (Object.keys(workerPayments).length > 0){
var coinPrecision = magnitude.toString().length - 1;
for (var worker in workerPayments){
if (workerPayments[worker] < processingConfig.minimumPayment * magnitude){
/* The worker's total earnings (balance + current reward) were not enough to warrant
a transaction, so we will store their balance in the database. Next time they
are rewarded it might reach the payout threshold. */
balanceUpdateCommands.push([
'hincrby',
coin + '_balances',
worker,
workerRewards[worker]
]);
delete workerPayments[worker];
}
else{
//If worker had a balance that is about to be paid out, subtract it from the database
if (workerBalances[worker] !== 0){
balanceUpdateCommands.push([
'hincrby',
coin + '_balances',
worker,
-1 * workerBalances[worker]
]);
}
var rewardInPrecision = (workerRewards[worker] / magnitude).toFixed(coinPrecision);
workerPayoutsCommand.push(['hincrbyfloat', coin + '_payouts', worker, rewardInPrecision]);
toBePaid += workerPayments[worker];
}
var trySend = function (withholdPercent) {
var addressAmounts = {};
var totalSent = 0;
for (var w in workers) {
var worker = workers[w];
worker.balance = worker.balance || 0;
worker.reward = worker.reward || 0;
var toSend = (worker.balance + worker.reward) * (1 - withholdPercent);
if (toSend >= minPaymentSatoshis) {
totalSent += toSend;
var address = worker.address = (worker.address || getProperAddress(w));
worker.sent = addressAmounts[address] = satoshisToCoins(toSend);
worker.balanceChange = Math.min(worker.balance, toSend) * -1;
}
else {
worker.balanceChange = Math.max(toSend - worker.balance, 0);
worker.sent = 0;
}
}
// txfee included in feeAmountToBeCollected
var leftOver = toBePaid / (1 - processingConfig.feePercent);
var feeAmountToBeCollected = toPrecision(leftOver * processingConfig.feePercent, coinPrecision);
var balanceLeftOver = totalBalance - toBePaid - feeAmountToBeCollected;
var minReserveSatoshis = processingConfig.minimumReserve * magnitude;
if (balanceLeftOver < minReserveSatoshis){
/* TODO: Need to convert all these variables into whole coin units before displaying because
humans aren't good at reading satoshi units. */
callback('Check finished - payments would wipe out minimum reserve, tried to pay out ' +
toBePaid + ' and collect ' + feeAmountToBeCollected + ' as fees' +
' but only have ' + totalBalance + '. Left over balance would be ' + balanceLeftOver +
', needs to be at least ' + minReserveSatoshis);
if (Object.keys(addressAmounts).length === 0){
callback(null, workers, rounds);
return;
}
/* Move pending blocks into either the orphaned or confirmed sets, and delete their no longer
required round/shares data. */
var movePendingCommands = [];
var roundsToDelete = [];
rounds.forEach(function(r){
var destinationSet = (function(){
switch(r.category){
case 'orphan': return '_blocksOrphaned';
case 'generate': return '_blocksConfirmed';
case 'dropkicked': return '_blocksDropKicked';
}
})();
movePendingCommands.push(['smove', coin + '_blocksPending', coin + destinationSet, r.serialized]);
roundsToDelete.push(coin + '_shares:round' + r.height)
});
var finalRedisCommands = [];
if (movePendingCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(movePendingCommands);
if (orphanMergeCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(orphanMergeCommands);
if (balanceUpdateCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(balanceUpdateCommands);
if (workerPayoutsCommand.length > 0)
finalRedisCommands = finalRedisCommands.concat(workerPayoutsCommand);
if (roundsToDelete.length > 0)
finalRedisCommands.push(['del'].concat(roundsToDelete));
if (toBePaid !== 0)
finalRedisCommands.push(['hincrbyfloat', coin + '_stats', 'totalPaid', (toBePaid / magnitude).toFixed(coinPrecision)]);
finalRedisCommands.push(['del', coin + '_finalRedisCommands']);
finalRedisCommands.push(['bgsave']);
callback(null, magnitude, workerPayments, finalRedisCommands);
});
},
function(magnitude, workerPayments, finalRedisCommands, callback) {
/* Save final redis cleanout commands in case something goes wrong during payments */
redisClient.set(coin + '_finalRedisCommands', JSON.stringify(finalRedisCommands), function(error, reply) {
if (error){
callback('Check finished - error with saving finalRedisCommands ' + JSON.stringify(error));
return;
}
callback(null, magnitude, workerPayments, finalRedisCommands);
});
},
function(magnitude, workerPayments, finalRedisCommands, callback){
//This does the final all-or-nothing atomic transaction once the coin daemon has sent payments
var finalizeRedisTx = function(){
redisClient.multi(finalRedisCommands).exec(function(error, results){
if (error){
callback('Error with final redis commands for cleaning up ' + JSON.stringify(error));
return;
daemon.cmd('sendmany', [addressAccount || '', addressAmounts], function (result) {
//Check if payments failed because wallet doesn't have enough coins to pay for tx fees
if (result.error && result.error.code === -6) {
var higherPercent = withholdPercent + 0.01;
logger.warning(logSystem, logComponent, 'Not enough funds to cover the tx fees for sending out payments, decreasing rewards by '
+ (higherPercent * 100) + '% and retrying');
trySend(higherPercent);
}
processingPayments = false;
logger.debug(logSystem, logComponent, 'Payments processing performed an interval');
else if (result.error) {
logger.error(logSystem, logComponent, 'Error trying to send payments with RPC sendmany '
+ JSON.stringify(result.error));
callback(true);
}
else {
logger.debug(logSystem, logComponent, 'Sent out a total of ' + (totalSent / magnitude)
+ ' to ' + Object.keys(addressAmounts).length + ' workers');
if (withholdPercent > 0) {
logger.warning(logSystem, logComponent, 'Had to withhold ' + (withholdPercent * 100)
+ '% of reward from miners to cover transaction fees. '
+ 'Fund pool wallet with coins to prevent this from happening');
}
callback(null, workers, rounds);
}
}, true, true);
};
trySend(0);
},
function(workers, rounds, callback){
var totalPaid = 0;
var balanceUpdateCommands = [];
var workerPayoutsCommand = [];
for (var w in workers) {
var worker = workers[w];
if (worker.balanceChange !== 0){
balanceUpdateCommands.push([
'hincrbyfloat',
coin + ':balances',
w,
satoshisToCoins(worker.balanceChange)
]);
}
if (worker.sent !== 0){
workerPayoutsCommand.push(['hincrbyfloat', coin + ':payouts', w, worker.sent]);
totalPaid += worker.sent;
}
}
var movePendingCommands = [];
var roundsToDelete = [];
var orphanMergeCommands = [];
var moveSharesToCurrent = function(r){
var workerShares = r.workerShares;
Object.keys(workerShares).forEach(function(worker){
orphanMergeCommands.push(['hincrby', coin + ':shares:roundCurrent',
worker, workerShares[worker]]);
});
};
if (Object.keys(workerPayments).length === 0){
finalizeRedisTx();
}
else{
rounds.forEach(function(r){
//This is how many decimal places to round a coin down to
var coinPrecision = magnitude.toString().length - 1;
var addressAmounts = {};
var totalAmountUnits = 0;
for (var address in workerPayments){
var coinUnits = toPrecision(workerPayments[address] / magnitude, coinPrecision);
addressAmounts[address] = coinUnits;
totalAmountUnits += coinUnits;
}
logger.debug(logSystem, logComponent, 'Payments to be sent to: ' + JSON.stringify(addressAmounts));
processingPayments = true;
daemon.cmd('sendmany', ['', addressAmounts], function(results){
if (results[0].error){
callback('Check finished - error with sendmany ' + JSON.stringify(results[0].error));
return;
}
finalizeRedisTx();
var totalWorkers = Object.keys(workerPayments).length;
logger.debug(logSystem, logComponent, 'Payments sent, a total of ' + totalAmountUnits
+ ' ' + poolOptions.coin.symbol + ' was sent to ' + totalWorkers + ' miners');
daemon.cmd('gettransaction', [results[0].response], function(results){
if (results[0].error){
callback('Check finished - error with gettransaction ' + JSON.stringify(results[0].error));
return;
switch(r.category){
case 'kicked':
movePendingCommands.push(['smove', coin + ':blocksPending', coin + ':blocksKicked', r.serialized]);
case 'orphan':
movePendingCommands.push(['smove', coin + ':blocksPending', coin + ':blocksOrphaned', r.serialized]);
if (r.canDeleteShares){
moveSharesToCurrent(r);
roundsToDelete.push(coin + ':shares:round' + r.height);
}
var feeAmountUnits = parseFloat((totalAmountUnits / (1 - processingConfig.feePercent) * processingConfig.feePercent).toFixed(coinPrecision));
var poolFees = feeAmountUnits - results[0].response.fee;
daemon.cmd('move', ['', processingConfig.feeCollectAccount, poolFees], function(results){
if (results[0].error){
callback('Check finished - error with move ' + JSON.stringify(results[0].error));
return;
}
callback(null, poolFees + ' ' + poolOptions.coin.symbol + ' collected as pool fee');
});
});
});
}
}
], function(error, result){
var paymentProcessTime = Date.now() - startPaymentProcess;
if (error)
logger.debug(logSystem, logComponent, '[Took ' + paymentProcessTime + 'ms] ' + error);
else{
logger.debug(logSystem, logComponent, '[' + paymentProcessTime + 'ms] ' + result);
// not sure if we need some time to let daemon update the wallet balance
setTimeout(withdrawalProfit, 1000);
}
});
};
var withdrawalProfit = function(){
if (!processingConfig.feeWithdrawalThreshold) return;
logger.debug(logSystem, logComponent, 'Profit withdrawal started');
daemon.cmd('getbalance', [processingConfig.feeCollectAccount], function(results){
// We have to pay some tx fee here too, but we shouldn't need to care much about it as long as the fee is less
// than the minimumReserve value. In that case, even if the feeCollectAccount ends up with a negative balance,
// the total wallet balance stays positive and feeCollectAccount is refilled during the next payment run.
var withdrawalAmount = results[0].response;
if (withdrawalAmount < processingConfig.feeWithdrawalThreshold){
logger.debug(logSystem, logComponent, 'Not enough profit to withdraw yet');
}
else{
var withdrawal = {};
withdrawal[processingConfig.feeReceiveAddress] = withdrawalAmount;
daemon.cmd('sendmany', [processingConfig.feeCollectAccount, withdrawal], function(results){
if (results[0].error){
logger.debug(logSystem, logComponent, 'Profit withdrawal finished - error with sendmany '
+ JSON.stringify(results[0].error));
return;
return;
case 'generate':
movePendingCommands.push(['smove', coin + ':blocksPending', coin + ':blocksConfirmed', r.serialized]);
roundsToDelete.push(coin + ':shares:round' + r.height);
return;
}
logger.debug(logSystem, logComponent, 'Profit sent, a total of ' + withdrawalAmount
+ ' ' + poolOptions.coin.symbol + ' was sent to ' + processingConfig.feeReceiveAddress);
});
var finalRedisCommands = [];
if (movePendingCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(movePendingCommands);
if (orphanMergeCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(orphanMergeCommands);
if (balanceUpdateCommands.length > 0)
finalRedisCommands = finalRedisCommands.concat(balanceUpdateCommands);
if (workerPayoutsCommand.length > 0)
finalRedisCommands = finalRedisCommands.concat(workerPayoutsCommand);
if (roundsToDelete.length > 0)
finalRedisCommands.push(['del'].concat(roundsToDelete));
if (totalPaid !== 0)
finalRedisCommands.push(['hincrbyfloat', coin + ':stats', 'totalPaid', totalPaid]);
if (finalRedisCommands.length === 0){
callback();
return;
}
startRedisTimer();
redisClient.multi(finalRedisCommands).exec(function(error, results){
endRedisTimer();
if (error){
clearInterval(paymentInterval);
logger.error(logSystem, logComponent,
'Payments sent but could not update redis. ' + JSON.stringify(error)
+ ' Disabling payment processing to prevent possible double-payouts. The redis commands in '
+ coin + '_finalRedisCommands.txt must be run manually');
fs.writeFile(coin + '_finalRedisCommands.txt', JSON.stringify(finalRedisCommands), function(err){
if (err)
logger.error('Could not write ' + coin + '_finalRedisCommands.txt - recover the redis commands from the log line above.');
});
}
callback();
});
}
});
], function(){
var paymentProcessTime = Date.now() - startPaymentProcess;
logger.debug(logSystem, logComponent, 'Finished interval - time spent: '
+ paymentProcessTime + 'ms total, ' + timeSpentRedis + 'ms redis, '
+ timeSpentRPC + 'ms daemon RPC');
});
};
};
var getProperAddress = function(address){
if (address.length === 40){
return util.addressFromEx(poolOptions.address, address);
}
else return address;
};
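/* A note on the fallback above (interpretation, not from the original comments):
   util.addressFromEx appears to take the version byte from the pool's own
   address and base58check-encode it with the 40-hex-char public key hash the
   miner supplied, so miners who authenticate with a raw hash are still paid
   to a valid address on the same network. Verify against stratum-pool's
   util.js before relying on this description. */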
}


@ -18,10 +18,19 @@ module.exports = function(logger){
var proxySwitch = {};
var redisClient = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
//Handle messages from master process sent via IPC
process.on('message', function(message) {
switch(message.type){
case 'banIP':
for (var p in pools){
if (pools[p].stratumServer)
pools[p].stratumServer.addBannedIP(message.ip);
}
break;
case 'blocknotify':
var messageCoin = message.coin.toLowerCase();
@ -30,36 +39,31 @@ module.exports = function(logger){
})[0];
if (poolTarget)
pools[poolTarget].processBlockNotify(message.hash);
pools[poolTarget].processBlockNotify(message.hash, 'blocknotify script');
break;
// IPC message for pool switching
case 'switch':
case 'coinswitch':
var logSystem = 'Proxy';
var logComponent = 'Switch';
var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
var messageCoin = message.coin.toLowerCase();
var newCoin = Object.keys(pools).filter(function(p){
return p.toLowerCase() === messageCoin;
})[0];
var switchName = message.switchName;
if (!newCoin){
logger.debug(logSystem, logComponent, logSubCat, 'Switch message to coin that is not recognized: ' + messageCoin);
break;
}
var newCoin = message.coin;
var algo = poolConfigs[newCoin].coin.algorithm;
var newPool = pools[newCoin];
var oldCoin = proxySwitch[algo].currentPool;
var oldPool = pools[oldCoin];
var proxyPort = proxySwitch[algo].port;
if (newCoin == oldCoin) {
var newPool = pools[newCoin];
var oldCoin = proxySwitch[switchName].currentPool;
var oldPool = pools[oldCoin];
var proxyPorts = Object.keys(proxySwitch[switchName].ports);
if (newCoin == oldCoin) {
logger.debug(logSystem, logComponent, logSubCat, 'Switch message would have no effect - ignoring ' + newCoin);
break;
}
break;
}
logger.debug(logSystem, logComponent, logSubCat, 'Proxy message for ' + algo + ' from ' + oldCoin + ' to ' + newCoin);
@ -67,25 +71,23 @@ module.exports = function(logger){
oldPool.relinquishMiners(
function (miner, cback) {
// relinquish miners that are attached to one of the "Auto-switch" ports and leave the others there.
cback(miner.client.socket.localPort == proxyPort)
cback(proxyPorts.indexOf(miner.client.socket.localPort.toString()) !== -1)
},
function (clients) {
newPool.attachMiners(clients);
}
);
proxySwitch[algo].currentPool = newCoin;
proxySwitch[switchName].currentPool = newCoin;
redisClient.hset('proxyState', algo, newCoin, function(error, obj) {
if (error) {
logger.error(logSystem, logComponent, logSubCat, 'Redis error writing proxy config: ' + JSON.stringify(error))
}
else {
logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state saved to redis for ' + algo);
}
});
var redisClient = redis.createClient(6379, "localhost")
redisClient.on('ready', function(){
redisClient.hset('proxyState', algo, newCoin, function(error, obj) {
if (error) {
logger.error(logSystem, logComponent, logSubCat, 'Redis error writing proxy config: ' + JSON.stringify(error))
}
else {
logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state saved to redis for ' + algo);
}
});
});
}
break;
}
@ -106,13 +108,11 @@ module.exports = function(logger){
diff: function(){}
};
var shareProcessing = poolOptions.shareProcessing;
//Functions required for MPOS compatibility
if (shareProcessing && shareProcessing.mpos && shareProcessing.mpos.enabled){
var mposCompat = new MposCompatibility(logger, poolOptions)
if (poolOptions.mposMode && poolOptions.mposMode.enabled){
var mposCompat = new MposCompatibility(logger, poolOptions);
handlers.auth = function(workerName, password, authCallback){
handlers.auth = function(port, workerName, password, authCallback){
mposCompat.handleAuth(workerName, password, authCallback);
};
@ -126,18 +126,32 @@ module.exports = function(logger){
}
//Functions required for internal payment processing
else if (shareProcessing && shareProcessing.internal && shareProcessing.internal.enabled){
else{
var shareProcessor = new ShareProcessor(logger, poolOptions)
var shareProcessor = new ShareProcessor(logger, poolOptions);
handlers.auth = function(workerName, password, authCallback){
if (shareProcessing.internal.validateWorkerAddress !== true)
handlers.auth = function(port, workerName, password, authCallback){
if (poolOptions.validateWorkerUsername !== true)
authCallback(true);
else {
pool.daemon.cmd('validateaddress', [workerName], function(results){
var isValid = results.filter(function(r){return r.response.isvalid}).length > 0;
authCallback(isValid);
});
if (workerName.length === 40) {
try {
new Buffer(workerName, 'hex');
authCallback(true);
}
catch (e) {
authCallback(false);
}
}
else {
pool.daemon.cmd('validateaddress', [workerName], function (results) {
var isValid = results.filter(function (r) {
return r.response.isvalid
}).length > 0;
authCallback(isValid);
});
}
}
};
@ -146,8 +160,8 @@ module.exports = function(logger){
};
}
var authorizeFN = function (ip, workerName, password, callback) {
handlers.auth(workerName, password, function(authorized){
var authorizeFN = function (ip, port, workerName, password, callback) {
handlers.auth(port, workerName, password, function(authorized){
var authString = authorized ? 'Authorized' : 'Unauthorized ';
@ -173,7 +187,7 @@ module.exports = function(logger){
logger.debug(logSystem, logComponent, logSubCat, 'Block found: ' + data.blockHash);
if (isValidShare)
logger.debug(logSystem, logComponent, logSubCat, 'Share accepted at diff ' + data.difficulty + ' by ' + data.worker + ' [' + data.ip + ']' );
logger.debug(logSystem, logComponent, logSubCat, 'Share accepted at diff ' + data.difficulty + '/' + data.shareDiff + ' by ' + data.worker + ' [' + data.ip + ']' );
else if (!isValidShare)
logger.debug(logSystem, logComponent, logSubCat, 'Share rejected: ' + shareData);
@ -186,6 +200,10 @@ module.exports = function(logger){
handlers.diff(workerName, diff);
}).on('log', function(severity, text) {
logger[severity](logSystem, logComponent, logSubCat, text);
}).on('banIP', function(ip, worker){
process.send({type: 'banIP', ip: ip});
}).on('started', function(){
_this.setDifficultyForProxyPort(pool, poolOptions.coin.name, poolOptions.coin.algorithm);
});
pool.start();
@ -193,9 +211,9 @@ module.exports = function(logger){
});
if (typeof(portalConfig.proxy) !== 'undefined') {
if (portalConfig.switching) {
var logSystem = 'Proxy';
var logSystem = 'Switching';
var logComponent = 'Setup';
var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
@ -206,73 +224,67 @@ module.exports = function(logger){
// on the last pool it was using when reloaded or restarted
//
logger.debug(logSystem, logComponent, logSubCat, 'Loading last proxy state from redis');
var redisClient = redis.createClient(6379, "localhost")
redisClient.on('ready', function(){
redisClient.hgetall("proxyState", function(error, obj) {
if (error) {
logger.debug(logSystem, logComponent, logSubCat, 'No last proxy state found in redis');
}
else {
proxyState = obj;
logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state loaded from redis');
}
//
// Setup proxySwitch object to control proxy operations from configuration and any restored
// state. Each algorithm has a listening port, current coin name, and an active pool to
// which traffic is directed when activated in the config.
//
// In addition, the proxy config also takes diff and varDiff parameters that override the
// defaults for the standard config of the coin.
//
Object.keys(portalConfig.proxy).forEach(function(algorithm) {
if (portalConfig.proxy[algorithm].enabled === true) {
var initalPool = proxyState.hasOwnProperty(algorithm) ? proxyState[algorithm] : _this.getFirstPoolForAlgorithm(algorithm);
proxySwitch[algorithm] = {
port: portalConfig.proxy[algorithm].port,
currentPool: initalPool,
proxy: {}
};
// Copy diff and vardiff configuration into pools that match our algorithm so the stratum server can pick them up
//
// Note: This seems a bit wonky and brittle - better if proxy just used the diff config of the port it was
// routed into instead.
//
if (portalConfig.proxy[algorithm].hasOwnProperty('varDiff')) {
proxySwitch[algorithm].varDiff = new Stratum.varDiff(proxySwitch[algorithm].port, portalConfig.proxy[algorithm].varDiff);
proxySwitch[algorithm].diff = portalConfig.proxy[algorithm].diff;
}
Object.keys(pools).forEach(function (coinName) {
var a = poolConfigs[coinName].coin.algorithm;
var p = pools[coinName];
if (a === algorithm) {
p.setVarDiff(proxySwitch[algorithm].port, proxySwitch[algorithm].varDiff);
}
});
proxySwitch[algorithm].proxy = net.createServer(function(socket) {
var currentPool = proxySwitch[algorithm].currentPool;
var logSubCat = 'Thread ' + (parseInt(forkId) + 1);
logger.debug(logSystem, 'Connect', logSubCat, 'Proxy connect from ' + socket.remoteAddress + ' on ' + proxySwitch[algorithm].port
+ ' routing to ' + currentPool);
pools[currentPool].getStratumServer().handleNewClient(socket);
}).listen(parseInt(proxySwitch[algorithm].port), function() {
logger.debug(logSystem, logComponent, logSubCat, 'Proxy listening for ' + algorithm + ' on port ' + proxySwitch[algorithm].port
+ ' into ' + proxySwitch[algorithm].currentPool);
});
}
else {
logger.debug(logSystem, logComponent, logSubCat, 'Proxy pool for ' + algorithm + ' disabled.');
}
});
});
}).on('error', function(err){
/*redisClient.on('error', function(err){
logger.debug(logSystem, logComponent, logSubCat, 'Pool configuration failed: ' + err);
});*/
redisClient.hgetall("proxyState", function(error, obj) {
if (!error && obj) {
proxyState = obj;
logger.debug(logSystem, logComponent, logSubCat, 'Last proxy state loaded from redis');
}
//
// Setup proxySwitch object to control proxy operations from configuration and any restored
// state. Each algorithm has a listening port, current coin name, and an active pool to
// which traffic is directed when activated in the config.
//
// In addition, the proxy config also takes diff and varDiff parameters that override the
// defaults for the standard config of the coin.
//
Object.keys(portalConfig.switching).forEach(function(switchName) {
var algorithm = portalConfig.switching[switchName].algorithm;
if (!portalConfig.switching[switchName].enabled) return;
var initialPool = proxyState.hasOwnProperty(algorithm) ? proxyState[algorithm] : _this.getFirstPoolForAlgorithm(algorithm);
proxySwitch[switchName] = {
algorithm: algorithm,
ports: portalConfig.switching[switchName].ports,
currentPool: initialPool,
servers: []
};
Object.keys(proxySwitch[switchName].ports).forEach(function(port){
var f = net.createServer(function(socket) {
var currentPool = proxySwitch[switchName].currentPool;
logger.debug(logSystem, 'Connect', logSubCat, 'Connection to '
+ switchName + ' from '
+ socket.remoteAddress + ' on '
+ port + ' routing to ' + currentPool);
if (pools[currentPool])
pools[currentPool].getStratumServer().handleNewClient(socket);
else
pools[initialPool].getStratumServer().handleNewClient(socket);
}).listen(parseInt(port), function() {
logger.debug(logSystem, logComponent, logSubCat, 'Switching "' + switchName
+ '" listening for ' + algorithm
+ ' on port ' + port
+ ' into ' + proxySwitch[switchName].currentPool);
});
proxySwitch[switchName].servers.push(f);
});
});
});
}
@ -286,4 +298,34 @@ module.exports = function(logger){
});
return foundCoin;
};
//
// Called when stratum pool emits its 'started' event to copy the initial diff and vardiff
// configuation for any proxy switching ports configured into the stratum pool object.
//
this.setDifficultyForProxyPort = function(pool, coin, algo) {
logger.debug(logSystem, logComponent, algo, 'Setting proxy difficulties after pool start');
Object.keys(portalConfig.switching).forEach(function(switchName) {
if (!portalConfig.switching[switchName].enabled) return;
var switchAlgo = portalConfig.switching[switchName].algorithm;
if (pool.options.coin.algorithm !== switchAlgo) return;
// we know the switch configuration matches the pool's algo, so setup the diff and
// vardiff for each of the switch's ports
for (var port in portalConfig.switching[switchName].ports) {
if (portalConfig.switching[switchName].ports[port].varDiff)
pool.setVarDiff(port, portalConfig.switching[switchName].ports[port].varDiff);
if (portalConfig.switching[switchName].ports[port].diff){
if (!pool.options.ports.hasOwnProperty(port))
pool.options.ports[port] = {};
pool.options.ports[port].diff = portalConfig.switching[switchName].ports[port].diff;
}
}
});
};
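/* For reference, a hypothetical "switching" config fragment this function reads
   (values illustrative; key names taken from the code above):
   "switching": {
       "switch1": {
           "enabled": true,
           "algorithm": "sha256",
           "ports": {
               "3333": { "diff": 32, "varDiff": { "minDiff": 8, "maxDiff": 512 } }
           }
       }
   }
*/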
};

libs/profitSwitch.js Normal file

@ -0,0 +1,666 @@
var async = require('async');
var net = require('net');
var bignum = require('bignum');
var algos = require('stratum-pool/lib/algoProperties.js');
var util = require('stratum-pool/lib/util.js');
var Cryptsy = require('./apiCryptsy.js');
var Poloniex = require('./apiPoloniex.js');
var Mintpal = require('./apiMintpal.js');
var Bittrex = require('./apiBittrex.js');
var Stratum = require('stratum-pool');
module.exports = function(logger){
var _this = this;
var portalConfig = JSON.parse(process.env.portalConfig);
var poolConfigs = JSON.parse(process.env.pools);
var logSystem = 'Profit';
//
// build status tracker for collecting coin market information
//
var profitStatus = {};
var symbolToAlgorithmMap = {};
Object.keys(poolConfigs).forEach(function(coin){
var poolConfig = poolConfigs[coin];
var algo = poolConfig.coin.algorithm;
if (!profitStatus.hasOwnProperty(algo)) {
profitStatus[algo] = {};
}
var coinStatus = {
name: poolConfig.coin.name,
symbol: poolConfig.coin.symbol,
difficulty: 0,
reward: 0,
exchangeInfo: {}
};
profitStatus[algo][poolConfig.coin.symbol] = coinStatus;
symbolToAlgorithmMap[poolConfig.coin.symbol] = algo;
});
//
// ensure we have something to switch
//
Object.keys(profitStatus).forEach(function(algo){
if (Object.keys(profitStatus[algo]).length <= 1) {
delete profitStatus[algo];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
if (symbolToAlgorithmMap[symbol] === algo)
delete symbolToAlgorithmMap[symbol];
});
}
});
if (Object.keys(profitStatus).length == 0){
logger.debug(logSystem, 'Config', 'No alternative coins to switch to in current config, switching disabled.');
return;
}
//
// setup APIs
//
var poloApi = new Poloniex(
// 'API_KEY',
// 'API_SECRET'
);
var cryptsyApi = new Cryptsy(
// 'API_KEY',
// 'API_SECRET'
);
var mintpalApi = new Mintpal(
// 'API_KEY',
// 'API_SECRET'
);
var bittrexApi = new Bittrex(
// 'API_KEY',
// 'API_SECRET'
);
//
// market data collection from Poloniex
//
this.getProfitDataPoloniex = function(callback){
async.series([
function(taskCallback){
poloApi.getTicker(function(err, data){
if (err){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Poloniex'))
exchangeInfo['Poloniex'] = {};
var marketData = exchangeInfo['Poloniex'];
if (data.hasOwnProperty('BTC_' + symbol)) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
var btcData = data['BTC_' + symbol];
marketData['BTC'].ask = new Number(btcData.lowestAsk);
marketData['BTC'].bid = new Number(btcData.highestBid);
marketData['BTC'].last = new Number(btcData.last);
marketData['BTC'].baseVolume = new Number(btcData.baseVolume);
marketData['BTC'].quoteVolume = new Number(btcData.quoteVolume);
}
if (data.hasOwnProperty('LTC_' + symbol)) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
var ltcData = data['LTC_' + symbol];
marketData['LTC'].ask = new Number(ltcData.lowestAsk);
marketData['LTC'].bid = new Number(ltcData.highestBid);
marketData['LTC'].last = new Number(ltcData.last);
marketData['LTC'].baseVolume = new Number(ltcData.baseVolume);
marketData['LTC'].quoteVolume = new Number(ltcData.quoteVolume);
}
// save LTC to BTC exchange rate
if (marketData.hasOwnProperty('LTC') && data.hasOwnProperty('BTC_LTC')) {
var btcLtc = data['BTC_LTC'];
marketData['LTC'].ltcToBtc = new Number(btcLtc.highestBid);
}
});
taskCallback();
});
},
function(taskCallback){
var depthTasks = [];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var marketData = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo['Poloniex'];
if (marketData.hasOwnProperty('BTC') && marketData['BTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromPoloniex('BTC', symbol, marketData['BTC'].bid, callback)
});
}
if (marketData.hasOwnProperty('LTC') && marketData['LTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromPoloniex('LTC', symbol, marketData['LTC'].bid, callback)
});
}
});
if (!depthTasks.length){
taskCallback();
return;
}
async.series(depthTasks, function(err){
if (err){
taskCallback(err);
return;
}
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getMarketDepthFromPoloniex = function(symbolA, symbolB, coinPrice, callback){
poloApi.getOrderBook(symbolA, symbolB, function(err, data){
if (err){
callback(err);
return;
}
var depth = new Number(0);
var totalQty = new Number(0);
if (data.hasOwnProperty('bids')){
data['bids'].forEach(function(order){
var price = new Number(order[0]);
var limit = new Number(coinPrice * portalConfig.profitSwitch.depth);
var qty = new Number(order[1]);
// only measure the depth down to configured depth
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
}
var marketData = profitStatus[symbolToAlgorithmMap[symbolB]][symbolB].exchangeInfo['Poloniex'];
marketData[symbolA].depth = depth;
if (totalQty > 0)
marketData[symbolA].weightedBid = new Number(depth / totalQty);
callback();
});
};
this.getProfitDataCryptsy = function(callback){
async.series([
function(taskCallback){
cryptsyApi.getTicker(function(err, data){
if (err || data.success != 1){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Cryptsy'))
exchangeInfo['Cryptsy'] = {};
var marketData = exchangeInfo['Cryptsy'];
var results = data.return.markets;
if (results && results.hasOwnProperty(symbol + '/BTC')) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
var btcData = results[symbol + '/BTC'];
marketData['BTC'].last = new Number(btcData.lasttradeprice);
marketData['BTC'].baseVolume = new Number(marketData['BTC'].last / btcData.volume);
marketData['BTC'].quoteVolume = new Number(btcData.volume);
if (btcData.sellorders != null)
marketData['BTC'].ask = new Number(btcData.sellorders[0].price);
if (btcData.buyorders != null) {
marketData['BTC'].bid = new Number(btcData.buyorders[0].price);
var limit = new Number(marketData['BTC'].bid * portalConfig.profitSwitch.depth);
var depth = new Number(0);
var totalQty = new Number(0);
btcData['buyorders'].forEach(function(order){
var price = new Number(order.price);
var qty = new Number(order.quantity);
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
marketData['BTC'].depth = depth;
if (totalQty > 0)
marketData['BTC'].weightedBid = new Number(depth / totalQty);
}
}
if (results && results.hasOwnProperty(symbol + '/LTC')) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
var ltcData = results[symbol + '/LTC'];
marketData['LTC'].last = new Number(ltcData.lasttradeprice);
marketData['LTC'].baseVolume = new Number(marketData['LTC'].last / ltcData.volume);
marketData['LTC'].quoteVolume = new Number(ltcData.volume);
if (ltcData.sellorders != null)
marketData['LTC'].ask = new Number(ltcData.sellorders[0].price);
if (ltcData.buyorders != null) {
marketData['LTC'].bid = new Number(ltcData.buyorders[0].price);
var limit = new Number(marketData['LTC'].bid * portalConfig.profitSwitch.depth);
var depth = new Number(0);
var totalQty = new Number(0);
ltcData['buyorders'].forEach(function(order){
var price = new Number(order.price);
var qty = new Number(order.quantity);
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
marketData['LTC'].depth = depth;
if (totalQty > 0)
marketData['LTC'].weightedBid = new Number(depth / totalQty);
}
}
});
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getProfitDataMintpal = function(callback){
async.series([
function(taskCallback){
mintpalApi.getTicker(function(err, response){
if (err || !response.data){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
response.data.forEach(function(market){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Mintpal'))
exchangeInfo['Mintpal'] = {};
var marketData = exchangeInfo['Mintpal'];
if (market.exchange == 'BTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
marketData['BTC'].last = new Number(market.last_price);
marketData['BTC'].baseVolume = new Number(market['24hvol']);
marketData['BTC'].quoteVolume = new Number(market['24hvol'] / market.last_price);
marketData['BTC'].ask = new Number(market.top_ask);
marketData['BTC'].bid = new Number(market.top_bid);
}
if (market.exchange == 'LTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
marketData['LTC'].last = new Number(market.last_price);
marketData['LTC'].baseVolume = new Number(market['24hvol']);
marketData['LTC'].quoteVolume = new Number(market['24hvol'] / market.last_price);
marketData['LTC'].ask = new Number(market.top_ask);
marketData['LTC'].bid = new Number(market.top_bid);
}
});
});
taskCallback();
});
},
function(taskCallback){
var depthTasks = [];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var marketData = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo['Mintpal'];
if (marketData.hasOwnProperty('BTC') && marketData['BTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromMintpal('BTC', symbol, marketData['BTC'].bid, callback)
});
}
if (marketData.hasOwnProperty('LTC') && marketData['LTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromMintpal('LTC', symbol, marketData['LTC'].bid, callback)
});
}
});
if (!depthTasks.length){
taskCallback();
return;
}
async.series(depthTasks, function(err){
if (err){
taskCallback(err);
return;
}
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getMarketDepthFromMintpal = function(symbolA, symbolB, coinPrice, callback){
mintpalApi.getBuyOrderBook(symbolA, symbolB, function(err, response){
if (err){
callback(err);
return;
}
var depth = new Number(0);
var totalQty = new Number(0);
if (response.hasOwnProperty('data')){
response['data'].forEach(function(order){
var price = new Number(order.price);
var limit = new Number(coinPrice * portalConfig.profitSwitch.depth);
var qty = new Number(order.amount);
// only measure the depth down to configured depth
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
}
var marketData = profitStatus[symbolToAlgorithmMap[symbolB]][symbolB].exchangeInfo['Mintpal'];
marketData[symbolA].depth = depth;
if (totalQty > 0)
marketData[symbolA].weightedBid = new Number(depth / totalQty);
callback();
});
};
this.getProfitDataBittrex = function(callback){
async.series([
function(taskCallback){
bittrexApi.getTicker(function(err, response){
if (err || !response.result){
taskCallback(err);
return;
}
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
response.result.forEach(function(market){
var exchangeInfo = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo;
if (!exchangeInfo.hasOwnProperty('Bittrex'))
exchangeInfo['Bittrex'] = {};
var marketData = exchangeInfo['Bittrex'];
var marketPair = market.MarketName.match(/([\w]+)-([\w-_]+)/);
market.exchange = marketPair[1];
market.code = marketPair[2];
if (market.exchange == 'BTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('BTC'))
marketData['BTC'] = {};
marketData['BTC'].last = new Number(market.Last);
marketData['BTC'].baseVolume = new Number(market.BaseVolume);
marketData['BTC'].quoteVolume = new Number(market.BaseVolume / market.Last);
marketData['BTC'].ask = new Number(market.Ask);
marketData['BTC'].bid = new Number(market.Bid);
}
if (market.exchange == 'LTC' && market.code == symbol) {
if (!marketData.hasOwnProperty('LTC'))
marketData['LTC'] = {};
marketData['LTC'].last = new Number(market.Last);
marketData['LTC'].baseVolume = new Number(market.BaseVolume);
marketData['LTC'].quoteVolume = new Number(market.BaseVolume / market.Last);
marketData['LTC'].ask = new Number(market.Ask);
marketData['LTC'].bid = new Number(market.Bid);
}
});
});
taskCallback();
});
},
function(taskCallback){
var depthTasks = [];
Object.keys(symbolToAlgorithmMap).forEach(function(symbol){
var marketData = profitStatus[symbolToAlgorithmMap[symbol]][symbol].exchangeInfo['Bittrex'];
if (marketData.hasOwnProperty('BTC') && marketData['BTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromBittrex('BTC', symbol, marketData['BTC'].bid, callback)
});
}
if (marketData.hasOwnProperty('LTC') && marketData['LTC'].bid > 0){
depthTasks.push(function(callback){
_this.getMarketDepthFromBittrex('LTC', symbol, marketData['LTC'].bid, callback)
});
}
});
if (!depthTasks.length){
taskCallback();
return;
}
async.series(depthTasks, function(err){
if (err){
taskCallback(err);
return;
}
taskCallback();
});
}
], function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
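Bittrex reports each pair as a single `MarketName` like `BTC-DOGE`, so the ticker handler above splits it with a regex before comparing against the coin symbol. The same split in isolation:

```javascript
// Split a Bittrex-style market name ("BTC-DOGE") into base and quote symbols,
// using the same regex as the ticker handler above.
function splitMarketName(marketName) {
    var marketPair = marketName.match(/([\w]+)-([\w-_]+)/);
    return { exchange: marketPair[1], code: marketPair[2] };
}

console.log(splitMarketName('BTC-DOGE')); // { exchange: 'BTC', code: 'DOGE' }
```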
this.getMarketDepthFromBittrex = function(symbolA, symbolB, coinPrice, callback){
bittrexApi.getOrderBook(symbolA, symbolB, function(err, response){
if (err){
callback(err);
return;
}
var depth = new Number(0);
var totalQty = new Number(0);
if (response.hasOwnProperty('result')){
response['result'].forEach(function(order){
var price = new Number(order.Rate);
var limit = new Number(coinPrice * portalConfig.profitSwitch.depth);
var qty = new Number(order.Quantity);
// only measure the depth down to configured depth
if (price >= limit){
depth += (qty * price);
totalQty += qty;
}
});
}
var marketData = profitStatus[symbolToAlgorithmMap[symbolB]][symbolB].exchangeInfo['Bittrex'];
marketData[symbolA].depth = depth;
if (totalQty > 0)
marketData[symbolA].weightedBid = new Number(depth / totalQty);
callback();
});
};
this.getCoindDaemonInfo = function(callback){
var daemonTasks = [];
Object.keys(profitStatus).forEach(function(algo){
Object.keys(profitStatus[algo]).forEach(function(symbol){
var coinName = profitStatus[algo][symbol].name;
var poolConfig = poolConfigs[coinName];
var daemonConfig = poolConfig.paymentProcessing.daemon;
daemonTasks.push(function(callback){
_this.getDaemonInfoForCoin(symbol, daemonConfig, callback)
});
});
});
if (daemonTasks.length == 0){
callback();
return;
}
async.series(daemonTasks, function(err){
if (err){
callback(err);
return;
}
callback(null);
});
};
this.getDaemonInfoForCoin = function(symbol, cfg, callback){
var daemon = new Stratum.daemon.interface([cfg], function(severity, message){
logger[severity](logSystem, symbol, message);
callback(null); // fail gracefully for each coin
});
daemon.cmd('getblocktemplate', [{"capabilities": [ "coinbasetxn", "workid", "coinbase/append" ]}], function(result) {
if (result[0].error != null) {
logger.error(logSystem, symbol, 'Error while reading daemon info: ' + JSON.stringify(result[0]));
callback(null); // fail gracefully for each coin
return;
}
var coinStatus = profitStatus[symbolToAlgorithmMap[symbol]][symbol];
var response = result[0].response;
// some coins don't provide a target, only bits, so we need to handle both
var target = response.target ? bignum(response.target, 16) : util.bignumFromBitsHex(response.bits);
coinStatus.difficulty = parseFloat((diff1 / target.toNumber()).toFixed(9));
logger.debug(logSystem, symbol, 'difficulty is ' + coinStatus.difficulty);
coinStatus.reward = response.coinbasevalue / 100000000;
callback(null);
});
};
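When a daemon reports only compact `bits` rather than a full target, the target is reconstructed as `mantissa * 256^(exponent - 3)` and difficulty is `diff1 / target`. A minimal sketch using BigInt (the module itself uses the `bignum` package; `diff1` here is the standard pool-style difficulty-1 target):

```javascript
// diff1: the standard difficulty-1 target (pool-style, 0x00000000ffff...).
var DIFF1 = BigInt('0x00000000ffff0000000000000000000000000000000000000000000000000000');

// Decode compact "bits" (e.g. "1d00ffff"): first byte is the exponent,
// remaining three bytes are the mantissa.
function targetFromBits(bitsHex) {
    var bits = BigInt('0x' + bitsHex);
    var exponent = bits >> 24n;
    var mantissa = bits & 0xffffffn;
    return mantissa << (8n * (exponent - 3n));
}

function difficultyFromBits(bitsHex) {
    // scale before the BigInt division so fractional difficulties survive
    return Number(DIFF1 * 1000000n / targetFromBits(bitsHex)) / 1000000;
}

console.log(difficultyFromBits('1d00ffff')); // 1 (Bitcoin's genesis-block bits)
```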
this.getMiningRate = function(callback){
var daemonTasks = [];
Object.keys(profitStatus).forEach(function(algo){
Object.keys(profitStatus[algo]).forEach(function(symbol){
var coinStatus = profitStatus[symbolToAlgorithmMap[symbol]][symbol];
coinStatus.blocksPerMhPerHour = 86400 / ((coinStatus.difficulty * Math.pow(2,32)) / (1 * 1000 * 1000));
coinStatus.coinsPerMhPerHour = coinStatus.reward * coinStatus.blocksPerMhPerHour;
});
});
callback(null);
};
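The rate formula above works as follows: at difficulty D a hash has a 1-in-(D·2³²) chance of solving a block, so 1 MH/s (10⁶ hashes/s) over 86400 seconds finds `86400 / (D * 2^32 / 10^6)` blocks per day (despite the "PerHour" variable names, the 86400 makes this a per-day figure, matching the "BTC/day" log messages). A worked example:

```javascript
// Expected blocks found per day at 1 MH/s for a given difficulty,
// mirroring the calculation in getMiningRate above.
function blocksPerDayPerMh(difficulty) {
    return 86400 / ((difficulty * Math.pow(2, 32)) / (1 * 1000 * 1000));
}

// At difficulty 1, 1 MH/s finds roughly 20.1 blocks per day.
console.log(blocksPerDayPerMh(1)); // ≈ 20.12

// With a hypothetical 50-coin block reward, that is ~1006 coins/day per MH/s.
console.log(50 * blocksPerDayPerMh(1)); // ≈ 1005.8
```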
this.switchToMostProfitableCoins = function() {
Object.keys(profitStatus).forEach(function(algo) {
var algoStatus = profitStatus[algo];
var bestExchange;
var bestCoin;
var bestBtcPerMhPerHour = 0;
Object.keys(profitStatus[algo]).forEach(function(symbol) {
var coinStatus = profitStatus[algo][symbol];
Object.keys(coinStatus.exchangeInfo).forEach(function(exchange){
var exchangeData = coinStatus.exchangeInfo[exchange];
if (exchangeData.hasOwnProperty('BTC') && exchangeData['BTC'].hasOwnProperty('weightedBid')){
var btcPerMhPerHour = exchangeData['BTC'].weightedBid * coinStatus.coinsPerMhPerHour;
if (btcPerMhPerHour > bestBtcPerMhPerHour){
bestBtcPerMhPerHour = btcPerMhPerHour;
bestExchange = exchange;
bestCoin = profitStatus[algo][symbol].name;
}
coinStatus.btcPerMhPerHour = btcPerMhPerHour;
logger.debug(logSystem, 'CALC', 'BTC/' + symbol + ' on ' + exchange + ' with ' + coinStatus.btcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
}
if (exchangeData.hasOwnProperty('LTC') && exchangeData['LTC'].hasOwnProperty('weightedBid')){
var btcPerMhPerHour = (exchangeData['LTC'].weightedBid * coinStatus.coinsPerMhPerHour) * exchangeData['LTC'].ltcToBtc;
if (btcPerMhPerHour > bestBtcPerMhPerHour){
bestBtcPerMhPerHour = btcPerMhPerHour;
bestExchange = exchange;
bestCoin = profitStatus[algo][symbol].name;
}
coinStatus.btcPerMhPerHour = btcPerMhPerHour;
logger.debug(logSystem, 'CALC', 'LTC/' + symbol + ' on ' + exchange + ' with ' + coinStatus.btcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
}
});
});
logger.debug(logSystem, 'RESULT', 'Best coin for ' + algo + ' is ' + bestCoin + ' on ' + bestExchange + ' with ' + bestBtcPerMhPerHour.toFixed(8) + ' BTC/day per Mh/s');
var client = net.connect(portalConfig.cliPort, function () {
client.write(JSON.stringify({
command: 'coinswitch',
params: [bestCoin],
options: {algorithm: algo}
}) + '\n');
}).on('error', function(error){
if (error.code === 'ECONNREFUSED')
logger.error(logSystem, 'CLI', 'Could not connect to NOMP instance on port ' + portalConfig.cliPort);
else
logger.error(logSystem, 'CLI', 'Socket error ' + JSON.stringify(error));
});
});
};
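The switch itself is issued over NOMP's CLI port as a newline-delimited JSON message. The shape of that message, sketched without the socket:

```javascript
// Build the newline-delimited JSON command that switchToMostProfitableCoins
// writes to the NOMP CLI port.
function buildCoinSwitchCommand(coinName, algorithm) {
    return JSON.stringify({
        command: 'coinswitch',
        params: [coinName],
        options: { algorithm: algorithm }
    }) + '\n';
}

console.log(buildCoinSwitchCommand('litecoin', 'scrypt'));
// {"command":"coinswitch","params":["litecoin"],"options":{"algorithm":"scrypt"}}
```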
var checkProfitability = function(){
logger.debug(logSystem, 'Check', 'Collecting profitability data.');
var profitabilityTasks = [];
if (portalConfig.profitSwitch.usePoloniex)
profitabilityTasks.push(_this.getProfitDataPoloniex);
if (portalConfig.profitSwitch.useCryptsy)
profitabilityTasks.push(_this.getProfitDataCryptsy);
if (portalConfig.profitSwitch.useMintpal)
profitabilityTasks.push(_this.getProfitDataMintpal);
if (portalConfig.profitSwitch.useBittrex)
profitabilityTasks.push(_this.getProfitDataBittrex);
profitabilityTasks.push(_this.getCoindDaemonInfo);
profitabilityTasks.push(_this.getMiningRate);
// has to be series
async.series(profitabilityTasks, function(err){
if (err){
logger.error(logSystem, 'Check', 'Error while checking profitability: ' + err);
return;
}
//
// TODO offer support for a userConfigurable function for deciding on coin to override the default
//
_this.switchToMostProfitableCoins();
});
};
setInterval(checkProfitability, portalConfig.profitSwitch.updateInterval * 1000);
};

View File

@ -1,36 +0,0 @@
var events = require('events');
var redis = require('redis');
var listener = module.exports = function listener(options){
var _this = this;
var redisConnection;
var emitLog = function(text){
_this.emit('log', text);
};
this.start = function(){
redisConnection = redis.createClient(options.redisPort, options.redisHost);
redisConnection.on("pmessage", function (pattern, channel, message) {
var coinname = channel.split(':')[1];
var blockhash = message;
//emitLog("Redis: Received block for "+coinname+" - hash: "+blockhash);
_this.emit('hash', {
"coin" : coinname,
"hash" : blockhash
});
});
redisConnection.on('connect', function (err, data) {
emitLog("Redis connected");
});
redisConnection.psubscribe(options.psubscribeKey);
emitLog("Connecting to redis!");
}
};
listener.prototype.__proto__ = events.EventEmitter.prototype;

View File

@ -14,12 +14,13 @@ value: a hash with..
*/
module.exports = function(logger, poolConfig){
var internalConfig = poolConfig.shareProcessing.internal;
var redisConfig = internalConfig.redis;
var redisConfig = poolConfig.redis;
var coin = poolConfig.coin.name;
var forkId = process.env.forkId;
var logSystem = 'Pool';
var logComponent = coin;
@ -38,41 +39,63 @@ module.exports = function(logger, poolConfig){
logger.error(logSystem, logComponent, logSubCat, 'Connection to redis database has been ended');
});
connection.info(function(error, response){
if (error){
logger.error(logSystem, logComponent, logSubCat, 'Redis version check failed');
return;
}
var parts = response.split('\r\n');
var version;
var versionString;
for (var i = 0; i < parts.length; i++){
if (parts[i].indexOf(':') !== -1){
var valParts = parts[i].split(':');
if (valParts[0] === 'redis_version'){
versionString = valParts[1];
version = parseFloat(versionString);
break;
}
}
}
if (!version){
logger.error(logSystem, logComponent, logSubCat, 'Could not detect redis version - it may be super old or broken');
}
else if (version < 2.6){
logger.error(logSystem, logComponent, logSubCat, "You're using redis version " + versionString + " - the minimum required version is 2.6. Follow the usage instructions.");
}
});
this.handleShare = function(isValidShare, isValidBlock, shareData){
var redisCommands = [];
if (isValidShare){
redisCommands.push(['hincrbyfloat', coin + '_shares:roundCurrent', shareData.worker, shareData.difficulty]);
redisCommands.push(['hincrby', coin + '_stats', 'validShares', 1]);
/* Stores share diff, worker, and unique value with a score that is the timestamp. Unique value ensures it
doesn't overwrite an existing entry, and timestamp as score lets us query shares from last X minutes to
generate hashrate for each worker and pool. */
var dateNow = Date.now();
redisCommands.push(['zadd', coin + '_hashrate', dateNow / 1000 | 0, [shareData.difficulty, shareData.worker, dateNow].join(':')]);
redisCommands.push(['hincrbyfloat', coin + ':shares:roundCurrent', shareData.worker, shareData.difficulty]);
redisCommands.push(['hincrby', coin + ':stats', 'validShares', 1]);
}
else{
redisCommands.push(['hincrby', coin + '_stats', 'invalidShares', 1]);
redisCommands.push(['hincrby', coin + ':stats', 'invalidShares', 1]);
}
/* Stores share diff, worker, and unique value with a score that is the timestamp. Unique value ensures it
doesn't overwrite an existing entry, and timestamp as score lets us query shares from last X minutes to
generate hashrate for each worker and pool. */
var dateNow = Date.now();
var hashrateData = [ isValidShare ? shareData.difficulty : -shareData.difficulty, shareData.worker, dateNow];
redisCommands.push(['zadd', coin + ':hashrate', dateNow / 1000 | 0, hashrateData.join(':')]);
if (isValidBlock){
redisCommands.push(['rename', coin + '_shares:roundCurrent', coin + '_shares:round' + shareData.height]);
redisCommands.push(['sadd', coin + '_blocksPending', [shareData.blockHash, shareData.txHash, shareData.height, shareData.reward].join(':')]);
redisCommands.push(['hincrby', coin + '_stats', 'validBlocks', 1]);
redisCommands.push(['rename', coin + ':shares:roundCurrent', coin + ':shares:round' + shareData.height]);
redisCommands.push(['sadd', coin + ':blocksPending', [shareData.blockHash, shareData.txHash, shareData.height].join(':')]);
redisCommands.push(['hincrby', coin + ':stats', 'validBlocks', 1]);
}
else if (shareData.blockHash){
redisCommands.push(['hincrby', coin + '_stats', 'invalidBlocks', 1]);
redisCommands.push(['hincrby', coin + ':stats', 'invalidBlocks', 1]);
}
connection.multi(redisCommands).exec(function(err, replies){
if (err)
logger.error(logSystem, logComponent, logSubCat, 'Error with share processor multi ' + JSON.stringify(err));
else
logger.debug(logSystem, logComponent, logSubCat, 'Share data and stats recorded');
});

View File

@ -20,7 +20,6 @@ module.exports = function(logger, portalConfig, poolConfigs){
this.statHistory = [];
this.statPoolHistory = [];
this.statPoolHistoryBuffer;
this.stats = {};
this.statsString = '';
@ -36,14 +35,7 @@ module.exports = function(logger, portalConfig, poolConfigs){
var poolConfig = poolConfigs[coin];
if (!poolConfig.shareProcessing || !poolConfig.shareProcessing.internal){
logger.error(logSystem, coin, 'Cannot do stats without internal share processing setup');
canDoStats = false;
return;
}
var internalConfig = poolConfig.shareProcessing.internal;
var redisConfig = internalConfig.redis;
var redisConfig = poolConfig.redis;
for (var i = 0; i < redisClients.length; i++){
var client = redisClients[i];
@ -60,7 +52,7 @@ module.exports = function(logger, portalConfig, poolConfigs){
function setupStatsRedis(){
redisStats = redis.createClient(portalConfig.website.stats.redis.port, portalConfig.website.stats.redis.host);
redisStats = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
redisStats.on('error', function(err){
logger.error(logSystem, 'Historics', 'Redis for stats had an error ' + JSON.stringify(err));
});
@ -84,7 +76,6 @@ module.exports = function(logger, portalConfig, poolConfigs){
_this.statHistory.forEach(function(stats){
addStatPoolHistory(stats);
});
deflateStatPoolHistory();
});
}
@ -96,7 +87,7 @@ module.exports = function(logger, portalConfig, poolConfigs){
for (var pool in stats.pools){
data.pools[pool] = {
hashrate: stats.pools[pool].hashrate,
workers: stats.pools[pool].workerCount,
workerCount: stats.pools[pool].workerCount,
blocks: stats.pools[pool].blocks
}
}
@ -104,11 +95,7 @@ module.exports = function(logger, portalConfig, poolConfigs){
}
function deflateStatPoolHistory(){
zlib.gzip(JSON.stringify(_this.statPoolHistory), function(err, buffer){
_this.statPoolHistoryBuffer = buffer;
});
}
this.getGlobalStats = function(callback){
@ -121,19 +108,19 @@ module.exports = function(logger, portalConfig, poolConfigs){
var redisCommands = [];
var redisComamndTemplates = [
['zremrangebyscore', '_hashrate', '-inf', '(' + windowTime],
['zrangebyscore', '_hashrate', windowTime, '+inf'],
['hgetall', '_stats'],
['scard', '_blocksPending'],
['scard', '_blocksConfirmed'],
['scard', '_blocksOrphaned']
var redisCommandTemplates = [
['zremrangebyscore', ':hashrate', '-inf', '(' + windowTime],
['zrangebyscore', ':hashrate', windowTime, '+inf'],
['hgetall', ':stats'],
['scard', ':blocksPending'],
['scard', ':blocksConfirmed'],
['scard', ':blocksOrphaned']
];
var commandsPerCoin = redisComamndTemplates.length;
var commandsPerCoin = redisCommandTemplates.length;
client.coins.map(function(coin){
redisComamndTemplates.map(function(t){
redisCommandTemplates.map(function(t){
var clonedTemplates = t.slice(0);
clonedTemplates[1] = coin + clonedTemplates[1];
redisCommands.push(clonedTemplates);
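Each template's key gets the coin name prefixed, and every template is cloned per coin so mutating one command can't leak into the next. Sketched on its own:

```javascript
// Expand the shared redis command templates into per-coin commands,
// mirroring the template-cloning loop above.
function buildStatsCommands(coins, templates) {
    var commands = [];
    coins.map(function(coin) {
        templates.map(function(t) {
            var cloned = t.slice(0);       // copy so the template is never mutated
            cloned[1] = coin + cloned[1];  // e.g. ':hashrate' -> 'litecoin:hashrate'
            commands.push(cloned);
        });
    });
    return commands;
}

console.log(buildStatsCommands(['litecoin'], [['hgetall', ':stats'], ['scard', ':blocksPending']]));
// [ [ 'hgetall', 'litecoin:stats' ], [ 'scard', 'litecoin:blocksPending' ] ]
```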
@ -154,7 +141,12 @@ module.exports = function(logger, portalConfig, poolConfigs){
symbol: poolConfigs[coinName].coin.symbol.toUpperCase(),
algorithm: poolConfigs[coinName].coin.algorithm,
hashrates: replies[i + 1],
poolStats: replies[i + 2] != null ? replies[i + 2] : { validShares: 0, validBlocks: 0, invalidShares: 0 },
poolStats: {
validShares: replies[i + 2] ? (replies[i + 2].validShares || 0) : 0,
validBlocks: replies[i + 2] ? (replies[i + 2].validBlocks || 0) : 0,
invalidShares: replies[i + 2] ? (replies[i + 2].invalidShares || 0) : 0,
totalPaid: replies[i + 2] ? (replies[i + 2].totalPaid || 0) : 0
},
blocks: {
pending: replies[i + 3],
confirmed: replies[i + 4],
@ -190,16 +182,33 @@ module.exports = function(logger, portalConfig, poolConfigs){
coinStats.hashrates.forEach(function(ins){
var parts = ins.split(':');
var workerShares = parseFloat(parts[0]);
coinStats.shares += workerShares;
var worker = parts[1];
if (worker in coinStats.workers)
coinStats.workers[worker] += workerShares;
else
coinStats.workers[worker] = workerShares;
if (workerShares > 0) {
coinStats.shares += workerShares;
if (worker in coinStats.workers)
coinStats.workers[worker].shares += workerShares;
else
coinStats.workers[worker] = {
shares: workerShares,
invalidshares: 0,
hashrateString: null
};
}
else {
if (worker in coinStats.workers)
coinStats.workers[worker].invalidshares -= workerShares; // workerShares is a negative number here
else
coinStats.workers[worker] = {
shares: 0,
invalidshares: -workerShares,
hashrateString: null
};
}
});
var shareMultiplier = algos[coinStats.algorithm].multiplier || 0;
var hashratePre = shareMultiplier * coinStats.shares / portalConfig.website.stats.hashrateWindow;
coinStats.hashrate = hashratePre | 0;
var shareMultiplier = Math.pow(2, 32) / algos[coinStats.algorithm].multiplier;
coinStats.hashrate = shareMultiplier * coinStats.shares / portalConfig.website.stats.hashrateWindow;
coinStats.workerCount = Object.keys(coinStats.workers).length;
portalStats.global.workers += coinStats.workerCount;
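Putting the two steps above together: each zset entry is `difficulty:worker:timestamp`, positive difficulties count as valid shares and negative ones as invalid, and pool hashrate is `(2^32 / algoMultiplier) * shares / windowSeconds`. A self-contained sketch (the multiplier of 65536 for scrypt is an assumption based on the usual `algos` table):

```javascript
// Fold raw hashrate-zset entries into per-worker share totals and a pool
// hashrate figure, mirroring the stats logic above.
function tallyShares(entries, algoMultiplier, windowSeconds) {
    var workers = {};
    var shares = 0;
    entries.forEach(function(ins) {
        var parts = ins.split(':');
        var workerShares = parseFloat(parts[0]);
        var worker = parts[1];
        if (!(worker in workers))
            workers[worker] = { shares: 0, invalidshares: 0 };
        if (workerShares > 0) {
            shares += workerShares;
            workers[worker].shares += workerShares;
        }
        else {
            // invalid shares are stored with a negated difficulty
            workers[worker].invalidshares -= workerShares;
        }
    });
    var shareMultiplier = Math.pow(2, 32) / algoMultiplier;
    return {
        workers: workers,
        hashrate: shareMultiplier * shares / windowSeconds
    };
}

// Hypothetical entries: two valid shares, plus one invalid share from workerA.
var stats = tallyShares(
    ['16:workerA:1401400000000', '-16:workerA:1401400001000', '32:workerB:1401400002000'],
    65536,  // assumed scrypt multiplier
    600     // hashrate window in seconds
);
console.log(stats.workers.workerA); // { shares: 16, invalidshares: 16 }
console.log(stats.hashrate);        // 65536 * 48 / 600 ≈ 5242.88 H/s
```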
@ -215,6 +224,10 @@ module.exports = function(logger, portalConfig, poolConfigs){
portalStats.algos[algo].hashrate += coinStats.hashrate;
portalStats.algos[algo].workers += Object.keys(coinStats.workers).length;
for (var worker in coinStats.workers) {
coinStats.workers[worker].hashrateString = _this.getReadableHashRateString(shareMultiplier * coinStats.workers[worker].shares / portalConfig.website.stats.hashrateWindow);
}
delete coinStats.hashrates;
delete coinStats.shares;
coinStats.hashrateString = _this.getReadableHashRateString(coinStats.hashrate);
@ -245,8 +258,6 @@ module.exports = function(logger, portalConfig, poolConfigs){
}
}
deflateStatPoolHistory();
redisStats.multi([
['zadd', 'statHistory', statGatherTime, _this.statsString],
['zremrangebyscore', 'statHistory', '-inf', '(' + retentionTime]

View File

@ -3,18 +3,24 @@ var fs = require('fs');
var path = require('path');
var async = require('async');
var watch = require('node-watch');
var redis = require('redis');
var dot = require('dot');
var express = require('express');
var bodyParser = require('body-parser');
var compress = require('compression');
var watch = require('node-watch');
var Stratum = require('stratum-pool');
var util = require('stratum-pool/lib/util.js');
var api = require('./api.js');
module.exports = function(logger){
dot.templateSettings.strip = false;
var portalConfig = JSON.parse(process.env.portalConfig);
var poolConfigs = JSON.parse(process.env.pools);
@ -31,8 +37,11 @@ module.exports = function(logger){
'home.html': '',
'getting_started.html': 'getting_started',
'stats.html': 'stats',
'tbs.html': 'tbs',
'workers.html': 'workers',
'api.html': 'api',
'admin.html': 'admin'
'admin.html': 'admin',
'mining_key.html': 'mining_key'
};
var pageTemplates = {};
@ -40,6 +49,9 @@ module.exports = function(logger){
var pageProcessed = {};
var indexesProcessed = {};
var keyScriptTemplate = '';
var keyScriptProcessed = '';
var processTemplates = function(){
@ -112,6 +124,89 @@ module.exports = function(logger){
setInterval(buildUpdatedWebsite, websiteConfig.stats.updateInterval * 1000);
var buildKeyScriptPage = function(){
async.waterfall([
function(callback){
var client = redis.createClient(portalConfig.redis.port, portalConfig.redis.host);
client.hgetall('coinVersionBytes', function(err, coinBytes){
if (err){
client.quit();
return callback('Failed grabbing coin version bytes from redis ' + JSON.stringify(err));
}
callback(null, client, coinBytes || {});
});
},
function (client, coinBytes, callback){
var enabledCoins = Object.keys(poolConfigs).map(function(c){return c.toLowerCase()});
var missingCoins = [];
enabledCoins.forEach(function(c){
if (!(c in coinBytes))
missingCoins.push(c);
});
callback(null, client, coinBytes, missingCoins);
},
function(client, coinBytes, missingCoins, callback){
var coinsForRedis = {};
async.each(missingCoins, function(c, cback){
var coinInfo = (function(){
for (var pName in poolConfigs){
if (pName.toLowerCase() === c)
return {
daemon: poolConfigs[pName].paymentProcessing.daemon,
address: poolConfigs[pName].address
}
}
})();
var daemon = new Stratum.daemon.interface([coinInfo.daemon], function(severity, message){
logger[severity](logSystem, c, message);
});
daemon.cmd('dumpprivkey', [coinInfo.address], function(result){
if (result[0].error){
logger.error(logSystem, c, 'Could not dumpprivkey for ' + c + ' ' + JSON.stringify(result[0].error));
cback();
return;
}
var vBytePub = util.getVersionByte(coinInfo.address)[0];
var vBytePriv = util.getVersionByte(result[0].response)[0];
coinBytes[c] = vBytePub.toString() + ',' + vBytePriv.toString();
coinsForRedis[c] = coinBytes[c];
cback();
});
}, function(err){
callback(null, client, coinBytes, coinsForRedis);
});
},
function(client, coinBytes, coinsForRedis, callback){
if (Object.keys(coinsForRedis).length > 0){
client.hmset('coinVersionBytes', coinsForRedis, function(err){
if (err)
logger.error(logSystem, 'Init', 'Failed inserting coin byte version into redis ' + JSON.stringify(err));
client.quit();
});
}
else{
client.quit();
}
callback(null, coinBytes);
}
], function(err, coinBytes){
if (err){
logger.error(logSystem, 'Init', err);
return;
}
try{
keyScriptTemplate = dot.template(fs.readFileSync('website/key.html', {encoding: 'utf8'}));
keyScriptProcessed = keyScriptTemplate({coins: coinBytes});
}
catch(e){
logger.error(logSystem, 'Init', 'Failed to read key.html file');
}
});
};
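The `getVersionByte` helper above comes from `stratum-pool`'s util module; conceptually it base58-decodes the address and takes the first payload byte, which the key page uses to match private keys to coins. A self-contained sketch, assuming standard base58check layout (checksum not verified):

```javascript
// Minimal base58 decode; the version byte is simply the first decoded byte.
var ALPHABET = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz';

function base58Decode(str) {
    var num = 0n;
    for (var i = 0; i < str.length; i++) {
        var idx = ALPHABET.indexOf(str[i]);
        if (idx === -1) throw new Error('invalid base58 character: ' + str[i]);
        num = num * 58n + BigInt(idx);
    }
    var bytes = [];
    while (num > 0n) {
        bytes.unshift(Number(num % 256n));
        num /= 256n;
    }
    // each leading '1' in the string encodes a leading zero byte
    for (var j = 0; j < str.length && str[j] === '1'; j++)
        bytes.unshift(0);
    return bytes;
}

function getVersionByte(address) {
    return base58Decode(address)[0];
}

// The testnet pool address from the litecoin example config decodes to
// version byte 111 (0x6F), the standard testnet pubkey-hash prefix.
console.log(getVersionByte('n4jSe18kZMCdGcZqaYprShXW6EH1wivUK1')); // 111
```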
buildKeyScriptPage();
var getPage = function(pageId){
if (pageId in pageProcessed){
@ -123,6 +218,7 @@ module.exports = function(logger){
var route = function(req, res, next){
var pageId = req.params.page || '';
if (pageId in indexesProcessed){
res.header('Content-Type', 'text/html');
res.end(indexesProcessed[pageId]);
}
else
@ -146,6 +242,10 @@ module.exports = function(logger){
next();
});
app.get('/key.html', function(req, res, next){
res.end(keyScriptProcessed);
});
app.get('/:page', route);
app.get('/', route);
@ -176,9 +276,15 @@ module.exports = function(logger){
res.send(500, 'Something broke!');
});
app.listen(portalConfig.website.port, function(){
logger.debug(logSystem, 'Server', 'Website started on port ' + portalConfig.website.port);
});
try {
app.listen(portalConfig.website.port, portalConfig.website.host, function () {
logger.debug(logSystem, 'Server', 'Website started on ' + portalConfig.website.host + ':' + portalConfig.website.port);
});
}
catch(e){
logger.error(logSystem, 'Server', 'Could not start website on ' + portalConfig.website.host + ':' + portalConfig.website.port
+ ' - it\'s either in use or you do not have permission');
}
};
};

View File

@ -1,26 +0,0 @@
var events = require('events');
var cluster = require('cluster');
var MposCompatibility = require('./mposCompatibility.js');
var ShareProcessor = require('./shareProcessor.js');
var processor = module.exports = function processor(logger, poolConfigs){
var _this = this;
this.init = function(){
Object.keys(cluster.workers).forEach(function(id) {
cluster.workers[id].on('message', function(data){
switch(data.type){
}
});
});
}
};
processor.prototype.__proto__ = events.EventEmitter.prototype;

View File

@ -1,6 +1,6 @@
{
"name": "node-open-mining-portal",
"version": "0.0.3",
"version": "0.0.4",
"description": "An extremely efficient, highly scalable, all-in-one, easy to setup cryptocurrency mining pool",
"keywords": [
"stratum",
@ -42,9 +42,13 @@
"compression": "*",
"dot": "*",
"colors": "*",
"node-watch": "*"
"node-watch": "*",
"request": "*",
"nonce": "*",
"bignum": "*",
"extend": "*"
},
"engines": {
"node": ">=0.10"
}
}
}

View File

@ -3,52 +3,22 @@
"coin": "litecoin.json",
"address": "n4jSe18kZMCdGcZqaYprShXW6EH1wivUK1",
"blockRefreshInterval": 1000,
"txRefreshInterval": 20000,
"jobRebroadcastTimeout": 55,
"connectionTimeout": 600,
"emitInvalidBlockHashes": false,
"shareVariancePercent": 15,
"shareProcessing": {
"internal": {
"enabled": true,
"validateWorkerAddress": true,
"paymentInterval": 20,
"minimumPayment": 70,
"minimumReserve": 10,
"feePercent": 0.05,
"feeCollectAccount": "feesCollected",
"feeReceiveAddress": "mppaGeNaSbG1Q7S6V3gL5uJztMhucgL9Vh",
"feeWithdrawalThreshold": 5,
"daemon": {
"host": "localhost",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
},
"redis": {
"host": "localhost",
"port": 6379
}
},
"mpos": {
"enabled": false,
"host": "localhost",
"port": 3306,
"user": "me",
"password": "mypass",
"database": "ltc",
"stratumAuth": "password"
}
"rewardRecipients": {
"n37vuNFkXfk15uFnGoVyHZ6PYQxppD3QqK": 1.5,
"22851477d63a085dbc2398c8430af1c09e7343f6": 0.1
},
"banning": {
"paymentProcessing": {
"enabled": true,
"time": 600,
"invalidPercent": 50,
"checkThreshold": 500,
"purgeInterval": 300
"paymentInterval": 20,
"minimumPayment": 70,
"daemon": {
"host": "127.0.0.1",
"port": 19332,
"user": "testuser",
"password": "testpass"
}
},
"ports": {
@ -72,24 +42,29 @@
"daemons": [
{
"host": "localhost",
"host": "127.0.0.1",
"port": 19332,
"user": "litecoinrpc",
"password": "testnet"
},
{
"host": "localhost",
"port": 19344,
"user": "litecoinrpc",
"password": "testnet"
"user": "testuser",
"password": "testpass"
}
],
"p2p": {
"enabled": false,
"host": "localhost",
"enabled": true,
"host": "127.0.0.1",
"port": 19333,
"protocolVersion": 70002,
"magic": "fcc1b7dc"
"disableTransactions": true
},
"mposMode": {
"enabled": false,
"host": "127.0.0.1",
"port": 3306,
"user": "me",
"password": "mypass",
"database": "ltc",
"checkPassword": true,
"autoCreateWorker": false
}
}
}

View File

@ -1,34 +0,0 @@
/*
This script should be hooked to the coin daemon as follows:
litecoind -blocknotify="node /path/to/this/script/blockNotify.js localhost:8117 password litecoin %s"
The above will tell litecoind to launch this script with those parameters every time a block is found.
This script will then send the blockhash along with other information to a listening TCP socket
*/
var net = require('net');
var config = process.argv[2];
var parts = config.split(':');
var host = parts[0];
var port = parts[1];
var password = process.argv[3];
var coin = process.argv[4];
var blockHash = process.argv[5];
var client = net.connect(port, host, function () {
console.log('client connected');
client.write(JSON.stringify({
password: password,
coin: coin,
hash: blockHash
}) + '\n');
});
client.on('data', function (data) {
console.log(data.toString());
//client.end();
});
client.on('end', function () {
console.log('client disconnected');
//process.exit();
});

Some files were not shown because too many files have changed in this diff