Compare commits

...

77 Commits

Author SHA1 Message Date
Janus  c3bf4673ee  lightning: scalarmult result is just a hash, not pubkey  2018-03-15 12:56:09 +01:00
Janus  c2618ecae2  lightning: either KeyDesc has KeyLocator, or it has PubKey, no need to handle both simultaneously  2018-03-15 12:19:49 +01:00
Janus  63aa8bd89e  lightning: in lnd sources, idx/fam==0 means no derivation  2018-03-15 12:05:39 +01:00
Janus  251e5d0fd2  lightning: in derivePrivKey, only do not HD derive if there is a pubkey  2018-03-15 11:50:53 +01:00
Janus  3bc81164a1  lightning: fetchPrivKey should also be able to not HD derive with None path arguments  2018-03-15 11:32:24 +01:00
Janus  33318b12e0  lightning: zero keylocator means no fancy derivation, pass keydesc argument correctly  2018-03-15 01:20:56 +01:00
Janus  40005dc00d  lightning: use derivePrivKey in signOutputRaw  2018-03-15 01:13:11 +01:00
Janus  b017301f84  lightning: another case of moving of pubkey to signdesc's keydesc  2018-03-15 01:01:35 +01:00
Janus  2ade42f356  lightning: fix pubkey from signdesc in SignOutputRaw  2018-03-15 00:52:04 +01:00
Janus  bba986c608  lightning: deserialize_privkey does not return int as in tuple idx 1  2018-03-14 18:50:01 +01:00
Janus  cf0cf4e585  lightning: ComputeInputScript: signDesc has keyDesc now, no raw pubKey  2018-03-14 18:44:06 +01:00
Janus  8f67179497  lightning: centralized key search  2018-03-14 18:34:12 +01:00
Janus  8f529047d4  lightning: DerivePrivKey tries keys that were previously generated by DeriveNextKey  2018-03-14 17:42:34 +01:00
Janus  721cb1ad0b  lightning: pass trace to client  2018-03-14 17:03:09 +01:00
Janus  567fba0edb  lightning: hex to bytes  2018-03-14 16:23:34 +01:00
Janus  abc797828c  lightning: use EC_KEY.get_public_key to get compressed key  2018-03-14 16:12:06 +01:00
Janus  10b5e825e0  lightning: avoid secret_multiplier, just use secret  2018-03-14 15:55:15 +01:00
Janus  eaf51d59c3  lightning: use EC_KEY because it has better sign interface, add convenience method for 32-byte pubkey  2018-03-14 14:41:44 +01:00
Janus  f434617084  lightning: use SigningKey instead of EC_KEY  2018-03-14 14:28:03 +01:00
Janus  9df0840e8f  lightning: take get 33-byte pubkey through EC_KEY.privkey.get_verifying_key().to_string()  2018-03-14 14:16:39 +01:00
Janus  8041ec484f  lightning: fix DeriveKey pubkey  2018-03-14 12:24:10 +01:00
Janus  75a0a725c4  lightning: ScalarMult: use 32-byte hash to construct priv-key and derive 33-byte pubkey  2018-03-14 10:00:15 +01:00
Janus  4f169886f8  lightning: big-endian privkey encoding in DeriveNextKey  2018-03-13 17:08:36 +01:00
Janus  ff6d1049d0  lightning: correct capitalization  2018-03-13 15:31:38 +01:00
Janus  44ddac371f  lightning: fix handling of secret_multiplier  2018-03-13 15:18:53 +01:00
Janus  8624552268  lightning: use decoded object properties  2018-03-13 14:48:40 +01:00
Janus  e718fb0834  lightning: attempt implementing SecretKeyRing  2018-03-09 21:30:43 +01:00
Janus  13cf439dd2  lightning: actual SecretKeyRing stubs  2018-03-08 17:54:15 +01:00
Janus  f4a54881f9  lightning: adapt to new deterministic lnd key interface (only stub)  2018-03-08 17:34:59 +01:00
Janus  640ef1ff0a  lightning: kivy: clear subscribers correctly, avoid syntax error  2018-03-08 17:33:18 +01:00
Janus  9460ef5cac  lightning: kivy: channel list prototype  2018-03-07 12:54:11 +01:00
Janus  fff07e6a91  lightning: kivy: add lightning qr scan handler stub  2018-03-06 15:14:13 +01:00
Janus  df8de2c2f1  lightning: kivy channels screen stub  2018-03-06 14:55:39 +01:00
Janus  1694bf3d56  lightning: kivy lightning send invoice ui  2018-03-05 13:43:48 +01:00
Janus  4635d37acf  lightning: remove generated files  2018-03-03 19:07:23 +01:00
Janus  32940fdea8  lightning: minimal qt invoice gui should work  2018-03-03 16:00:57 +01:00
Janus  77164249f3  lightning: qt invoice list  2018-03-02 17:52:12 +01:00
Janus  5a9a624857  lightning: fix newline count  2018-03-01 16:22:30 +01:00
Janus  ac52c40857  lightning: send newlines after messages  2018-03-01 16:17:57 +01:00
Janus  a396df4f5d  lightning: print invoice updates  2018-03-01 12:26:34 +01:00
Janus  559d12e3a3  lightning: polish timeouts on 1080 conn  2018-02-21 15:29:31 +01:00
Janus  f219af81be  asyncio: try CA signed certificate first, previous behaviour totally broken  2018-02-21 11:32:49 +01:00
Janus  eed40a9a41  lightning: ten minute timeout  2018-02-21 00:28:58 +01:00
Janus  71153fd087  lightning: writeDb stub instead of setHdSeed  2018-02-19 16:35:34 +01:00
Janus  3ab42b3606  lightning: call coroutines on callback queue correctly  2018-02-15 12:18:01 +01:00
Janus  91083c5f68  lightning: add send_async, asynchronous_get, broadcast_async  2018-02-15 10:09:18 +01:00
Janus  0e29b76231  lightning: fix lightning arguments for subcommands, again  2018-02-13 11:48:45 +01:00
Janus  40f8c6152e  lightning: fix lightning subcommand arguments  2018-02-13 11:45:07 +01:00
Janus  1cbb750e02  lightning: enable arbitrary arguments through json (over stdin)  2018-02-13 11:15:28 +01:00
Janus  3f069fb25f  asyncio: fix boolean expression for stopping  2018-02-12 16:16:17 +01:00
Janus  b427f55caf  asyncio: use is_running of interface instead of global stop flag  2018-02-12 16:02:35 +01:00
Janus  6154e93222  lightning: 30 sec command timeout  2018-02-12 14:53:31 +01:00
Janus  53bc89fb96  lightning: complain on encrypted wallet  2018-02-09 14:18:24 +01:00
Janus  21e3e87419  lightning: timeout cli lightning calls in 10 sec  2018-02-06 17:23:00 +01:00
Janus  6bdec74a8f  protoc_lightning: make lib/ln and __init__.py  2018-02-06 16:25:52 +01:00
Janus  8e7377b66f  protoc: include ~/include  2018-02-06 13:37:22 +01:00
Janus  94fe66dbef  use protoc in ~/go/bin  2018-02-06 12:31:12 +01:00
Janus  6ed9348d4b  lightning: enable usage through daemon  2018-02-01 16:57:52 +01:00
Janus  0d26188498  lightning: rebased on Jan '18 asyncio  2018-02-01 12:05:47 +01:00
Janus  94f64f8ce0  asyncio: remove lol.py test script  2018-02-01 12:02:42 +01:00
Janus  7493d51b03  asyncio: remove out.perf  2018-02-01 12:01:54 +01:00
Janus  87075a7a11  asyncio: remove autogenerated  2018-02-01 12:00:42 +01:00
Janus  0a4d41f8aa  asyncio: remove remaining requested_chunks leftover  2018-02-01 10:59:12 +01:00
Janus  387981a642  asyncio: don't close loop explicitly (prevent callback calls from throwing), ignore already disconnected servers  2018-02-01 10:54:30 +01:00
Janus  3e2881bcfc  asyncio: add locks for more robust network handling  2018-02-01 10:54:30 +01:00
Janus  1cfdcf4e25  remove dead testnet servers  2018-02-01 10:54:30 +01:00
Janus  37f1e3bd95  asyncio: more robost network connection handling, shorter timeouts  2018-02-01 10:54:30 +01:00
Janus  1555100632  asyncio: fix process_pending_sends_job, remove stale comment, remove debug output  2018-02-01 10:54:11 +01:00
Janus  2d1ccfcc69  asyncio: support switching servers  2018-02-01 10:53:44 +01:00
Janus  683205a3fa  asyncio: warn if sending takes too long, only output errors if not shutting down  2018-02-01 10:53:44 +01:00
Janus  dcb0a24e6f  asyncio: more graceful shutdown  2018-02-01 10:53:44 +01:00
Janus  200a085778  asyncio: do not pin CA certificates, poll for cert differently  2018-02-01 10:53:44 +01:00
Janus  cfbc4422da  asyncio: fix off-by-one in ssl_in_socks, style fixes  2018-02-01 10:53:44 +01:00
Janus  3ffedf83fc  asyncio: try interfaces in parallel  2018-02-01 10:53:44 +01:00
Janus  88f906bc2a  asyncio: avoid StreamReader.readuntil  2018-02-01 10:53:44 +01:00
ThomasV  ef9236dd0c  asyncio: update requirements.txt  2018-02-01 10:53:44 +01:00
Janus  e170f4c7d3  use asyncio in network layer  2018-02-01 10:53:44 +01:00
27 changed files with 2140 additions and 647 deletions

View File

@ -7,7 +7,7 @@ jsonrpclib-pelix==0.3.1
pbkdf2==1.3
protobuf==3.5.0.post1
pyaes==1.6.1
PySocks==1.6.7
aiosocks==0.2.6
qrcode==5.3
requests==2.18.4
six==1.11.0

View File

@ -387,6 +387,12 @@
AddressScreen:
id: address_screen
tab: address_tab
LightningPayerScreen:
id: lightning_payer_screen
tab: lightning_payer_tab
LightningChannelsScreen:
id: lightning_channels_screen
tab: lightning_channels_tab
CleanHeader:
id: invoices_tab
text: _('Invoices')
@ -407,6 +413,14 @@
id: address_tab
text: _('Addresses')
slide: 4
CleanHeader:
id: lightning_payer_tab
text: _('Send Lightning Payment')
slide: 5
CleanHeader:
id: lightning_channels_tab
text: _('Lightning Channels')
slide: 6
<ActionOvrButton@ActionButton>

View File

@ -21,6 +21,7 @@ from electrum.util import profiler, parse_URI, format_time, InvalidPassword, Not
from electrum import bitcoin
from electrum.util import timestamp_to_datetime
from electrum.paymentrequest import PR_UNPAID, PR_PAID, PR_UNKNOWN, PR_EXPIRED
import electrum.lightning as lightning
from .context_menu import ContextMenu
@ -589,6 +590,59 @@ class AddressScreen(CScreen):
def ext_search(self, card, search):
return card.memo.find(search) >= 0 or card.amount.find(search) >= 0
class LightningChannelsScreen(CScreen):
kvname = "lightning_channels"
def __init__(self, *args, **kwargs):
super(LightningChannelsScreen, self).__init__(*args, **kwargs)
self.clocks = []
def on_activate(self, *args, **kwargs):
super(LightningChannelsScreen, self).on_activate(*args, **kwargs)
for i in self.clocks: i.cancel()
self.clocks.append(Clock.schedule_interval(self.fetch_channels, 10))
self.app.wallet.lightning.subscribe(self.rpc_result_handler)
def on_deactivate(self, *args, **kwargs):
self.app.wallet.lightning.clearSubscribers()
def fetch_channels(self, dw):
lightning.lightningCall(self.app.wallet.lightning, "listchannels")()
def rpc_result_handler(self, res):
if isinstance(res, Exception):
raise res
channel_cards = self.screen.ids.lightning_channels_container
channel_cards.clear_widgets()
for i in res["channels"]:
item = Factory.LightningChannelItem()
item.screen = self
item.channelId = i.channelId
channel_cards.add_widget(item)
class LightningPayerScreen(CScreen):
kvname = 'lightning_payer'
def on_activate(self, *args, **kwargs):
super(LightningPayerScreen, self).on_activate(*args, **kwargs)
class FakeQtSignal:
def emit(self2, data):
self.app.show_info(data)
class MyConsole:
newResult = FakeQtSignal()
self.app.wallet.lightning.setConsole(MyConsole())
def on_deactivate(self, *args, **kwargs):
self.app.wallet.lightning.setConsole(None)
def do_paste_sample(self):
self.screen.invoice_data = "lnbc1pvjluezpp5qqqsyqcyq5rqwzqfqqqsyqcyq5rqwzqfqqqsyqcyq5rqwzqfqypqdpl2pkx2ctnv5sxxmmwwd5kgetjypeh2ursdae8g6twvus8g6rfwvs8qun0dfjkxaq8rkx3yf5tcsyz3d73gafnh3cax9rn449d9p5uxz9ezhhypd0elx87sjle52x86fux2ypatgddc6k63n7erqz25le42c4u4ecky03ylcqca784w"
def do_paste(self):
contents = self.app._clipboard.paste()
if not contents:
self.app.show_info(_("Clipboard is empty"))
return
self.screen.invoice_data = contents
def do_clear(self):
self.screen.invoice_data = ""
def do_pay(self):
lightning.lightningCall(self.app.wallet.lightning, "sendpayment")("--pay_req=" + self.screen.invoice_data)
def on_lightning_qr(self):
self.app.show_info("Lightning Invoice QR scanning not implemented") # TODO

View File

@ -0,0 +1,18 @@
<LightningChannelItem@CardItem>
channelId: '<channelId not set>'
Label:
text: root.channelId
LightningChannelsScreen:
name: 'lightning_channels'
BoxLayout:
orientation: 'vertical'
spacing: '1dp'
ScrollView:
GridLayout:
cols: 1
id: lightning_channels_container
size_hint: 1, None
height: self.minimum_height
spacing: '2dp'
padding: '12dp'

View File

@ -0,0 +1,32 @@
LightningPayerScreen:
id: s
name: 'lightning_payer'
invoice_data: ''
BoxLayout:
orientation: "vertical"
BlueButton:
text: s.invoice_data if s.invoice_data else _('Lightning invoice')
shorten: True
on_release: Clock.schedule_once(lambda dt: app.show_info(_('Copy and paste the lightning invoice using the Paste button, or use the camera to scan a QR code.')))
GridLayout:
cols: 4
size_hint: 1, None
height: '48dp'
IconButton:
id: qr
on_release: Clock.schedule_once(lambda dt: app.scan_qr(on_complete=s.on_lightning_qr))
icon: 'atlas://gui/kivy/theming/light/camera'
Button:
text: _('Paste')
on_release: s.parent.do_paste()
Button:
text: _('Paste sample')
on_release: s.parent.do_paste_sample()
Button:
text: _('Clear')
on_release: s.parent.do_clear()
Button:
size_hint: 1, None
height: '48dp'
text: _('Pay pasted/scanned invoice')
on_release: s.parent.do_pay()

View File

@ -25,7 +25,7 @@
import signal
import sys
import traceback
try:
import PyQt5
@ -47,7 +47,7 @@ from electrum.util import UserCancelled, print_error
# from electrum.wallet import Abstract_Wallet
from .installwizard import InstallWizard, GoBack
from electrum.lightning import LightningUI
try:
from . import icons_rc
@ -92,6 +92,10 @@ class ElectrumGui:
#network.add_jobs([DebugMem([Abstract_Wallet, SPV, Synchronizer,
# ElectrumWindow], interval=5)])
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_X11InitThreads)
def setConsoleAndReturnLightning():
self.windows[0].wallet.lightning.setConsole(self.windows[0].console)
return self.windows[0].wallet.lightning
self.lightning = LightningUI(setConsoleAndReturnLightning)
if hasattr(QtCore.Qt, "AA_ShareOpenGLContexts"):
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_ShareOpenGLContexts)
self.config = config
@ -191,6 +195,7 @@ class ElectrumGui:
try:
wallet = self.daemon.load_wallet(path, None)
except BaseException as e:
traceback.print_exc()
d = QMessageBox(QMessageBox.Warning, _('Error'), 'Cannot load wallet:\n' + str(e))
d.exec_()
return
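
The setConsoleAndReturnLightning closure is needed because no wallet (and therefore no wallet.lightning) exists yet when ElectrumGui is constructed, so LightningUI is handed a zero-argument callable and resolves it lazily on each use (see LightningUI in lib/lightning.py further down). A minimal sketch of that lazy-proxy pattern, with hypothetical stand-in classes:

class LazyRpcProxy:
    """Resolve the real RPC object only when a method is actually requested."""
    def __init__(self, rpc_getter):
        self._rpc_getter = rpc_getter

    def __getattr__(self, name):
        return getattr(self._rpc_getter(), name)

class FakeRpc:  # stand-in for wallet.lightning
    def listchannels(self):
        return {"channels": []}

holder = {}                                    # the wallet only appears later
proxy = LazyRpcProxy(lambda: holder["wallet"])
holder["wallet"] = FakeRpc()                   # e.g. after load_wallet()
print(proxy.listchannels())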

View File

@ -18,6 +18,8 @@ else:
class Console(QtWidgets.QPlainTextEdit):
newResult = QtCore.pyqtSignal(str)
def __init__(self, prompt='>> ', startup_message='', parent=None):
QtWidgets.QPlainTextEdit.__init__(self, parent)
@ -34,6 +36,7 @@ class Console(QtWidgets.QPlainTextEdit):
self.updateNamespace({'run':self.run_script})
self.set_json(False)
self.newResult.connect(self.handleNewResult)
def set_json(self, b):
self.is_json = b
@ -45,7 +48,8 @@ class Console(QtWidgets.QPlainTextEdit):
# eval is generally considered bad practice. use it wisely!
result = eval(script, self.namespace, self.namespace)
def handleNewResult(self, msg):
self.showMessage(msg)
def updateNamespace(self, namespace):
self.namespace.update(namespace)
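
The newResult signal added to Console exists so that replies produced on the lightning RPC worker thread (LightningRPC in lib/lightning.py below calls console.newResult.emit(...) from a plain threading.Thread) end up rendered on the Qt GUI thread; Qt queues cross-thread signal emissions onto the receiver's event loop. A minimal sketch of the pattern, with a hypothetical ResultSource object:

import sys
from PyQt5 import QtCore

class ResultSource(QtCore.QObject):
    newResult = QtCore.pyqtSignal(str)

app = QtCore.QCoreApplication(sys.argv)
source = ResultSource()
source.newResult.connect(lambda msg: print("console got:", msg))
source.newResult.emit("{}")  # same-thread emit: the slot runs immediately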

View File

@ -57,6 +57,8 @@ class Exception_Window(QWidget):
def __init__(self, main_window, exctype, value, tb):
self.exc_args = (exctype, value, tb)
print(self.get_traceback_info()["exc_string"])
print(self.get_traceback_info()["stack"])
self.main_window = main_window
QWidget.__init__(self)
self.setWindowTitle('Electrum - ' + _('An Error Occured'))

View File

@ -0,0 +1,147 @@
# -*- coding: utf-8 -*-
import base64
import binascii
from PyQt5 import QtCore, QtWidgets
from collections import OrderedDict
import logging
from electrum.lightning import lightningCall
mapping = {0: "r_hash", 1: "pay_req", 2: "settled"}
revMapp = {"r_hash": 0, "pay_req": 1, "settled": 2}
datatable = OrderedDict([])
idx = 0
class MyTableRow(QtWidgets.QTreeWidgetItem):
def __init__(self, di):
if "settled" not in di:
di["settled"] = False
strs = [str(di[mapping[key]]) for key in range(len(mapping))]
print(strs)
super(MyTableRow, self).__init__(strs)
assert isinstance(di, dict)
self.di = di
def __getitem__(self, idx):
return self.di[idx]
def __setitem__(self, idx, val):
self.di[idx] = val
try:
self.setData(revMapp[idx], QtCore.Qt.DisplayRole, '{0}'.format(val))
except KeyError:
logging.warning("Lightning Invoice field %s unknown", idx)
def __str__(self):
return str(self.di)
def addInvoiceRow(new):
made = MyTableRow(new)
datatable[new["r_hash"]] = made
datatable.move_to_end(new["r_hash"], last=False)
return made
def clickHandler(numInput, treeView, lightningRpc):
amt = numInput.value()
if amt < 1:
print("value too small")
return
print("creating invoice with value {}".format(amt))
global idx
#obj = {
# "r_hash": binascii.hexlify((int.from_bytes(bytearray.fromhex("9500edb0994b7bc23349193486b25c82097045db641f35fa988c0e849acdec29"), "big")+idx).to_bytes(byteorder="big", length=32)).decode("ascii"),
# "pay_req": "lntb81920n1pdf258s" + str(idx),
# "settled": False
#}
#treeView.insertTopLevelItem(0, addInvoiceRow(obj))
idx += 1
lightningCall(lightningRpc, "addinvoice")("--amt=" + str(amt))
class LightningInvoiceList(QtWidgets.QWidget):
def create_menu(self, position):
menu = QtWidgets.QMenu()
pay_req = self._tv.currentItem()["pay_req"]
cb = QtWidgets.QApplication.instance().clipboard()
def copy():
print(pay_req)
cb.setText(pay_req)
menu.addAction("Copy payment request", copy)
menu.exec_(self._tv.viewport().mapToGlobal(position))
def lightningWorkerHandler(self, sourceClassName, obj):
new = {}
for k, v in obj.items():
try:
v = binascii.hexlify(base64.b64decode(v)).decode("ascii")
except:
pass
new[k] = v
try:
obj = datatable[new["r_hash"]]
except KeyError:
print("lightning payment invoice r_hash {} unknown!".format(new["r_hash"]))
else:
for k, v in new.items():
try:
if obj[k] != v: obj[k] = v
except KeyError:
obj[k] = v
def lightningRpcHandler(self, methodName, obj):
if methodName != "addinvoice":
print("ignoring reply {} to {}".format(obj, methodName))
return
self._tv.insertTopLevelItem(0, addInvoiceRow(obj))
def __init__(self, parent, lightningWorker, lightningRpc):
QtWidgets.QWidget.__init__(self, parent)
lightningWorker.subscribe(self.lightningWorkerHandler)
lightningRpc.subscribe(self.lightningRpcHandler)
self._tv=QtWidgets.QTreeWidget(self)
self._tv.setHeaderLabels([mapping[i] for i in range(len(mapping))])
self._tv.setColumnCount(len(mapping))
self._tv.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
self._tv.customContextMenuRequested.connect(self.create_menu)
class SatoshiCountSpinBox(QtWidgets.QSpinBox):
def keyPressEvent(self2, e):
super(SatoshiCountSpinBox, self2).keyPressEvent(e)
if QtCore.Qt.Key_Return == e.key():
clickHandler(self2, self._tv, lightningRpc)
numInput = SatoshiCountSpinBox(self)
button = QtWidgets.QPushButton('Add invoice', self)
button.clicked.connect(lambda: clickHandler(numInput, self._tv, lightningRpc))
l=QtWidgets.QVBoxLayout(self)
h=QtWidgets.QGridLayout(self)
h.addWidget(numInput, 0, 0)
h.addWidget(button, 0, 1)
#h.addItem(QtWidgets.QSpacerItem(100, 200, QtWidgets.QSizePolicy.Preferred, QtWidgets.QSizePolicy.Preferred), 0, 2)
#h.setSizePolicy(
h.setColumnStretch(0, 1)
h.setColumnStretch(1, 1)
h.setColumnStretch(2, 2)
l.addLayout(h)
l.addWidget(self._tv)
self.resize(2500,1000)
def tick():
key = "9500edb0994b7bc23349193486b25c82097045db641f35fa988c0e849acdec29"
if not key in datatable:
return
row = datatable[key]
row["settled"] = not row["settled"]
print("data changed")
if __name__=="__main__":
from sys import argv, exit
a=QtWidgets.QApplication(argv)
w=LightningInvoiceList()
w.show()
w.raise_()
timer = QtCore.QTimer()
timer.timeout.connect(tick)
timer.start(1000)
exit(a.exec_())
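
lightningWorkerHandler above converts incoming values because lnd's protobuf-to-JSON mapping encodes bytes fields such as r_hash as base64, while this table is keyed by hex. A minimal sketch of that normalization (the helper name is hypothetical):

import base64
import binascii

def to_hex_if_base64(value):
    # Mirrors the try/except in lightningWorkerHandler: convert when the value
    # decodes as base64, otherwise pass it through unchanged.
    try:
        return binascii.hexlify(base64.b64decode(value)).decode("ascii")
    except (TypeError, ValueError):
        return value

print(to_hex_if_base64("lQDt"))   # -> '9500ed', the prefix of the sample r_hash used in tick()
print(to_hex_if_base64(False))    # non-strings pass through unchanged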

View File

@ -147,6 +147,9 @@ class ElectrumWindow(QMainWindow, MessageBoxMixin, PrintError):
tabs.addTab(self.send_tab, QIcon(":icons/tab_send.png"), _('Send'))
tabs.addTab(self.receive_tab, QIcon(":icons/tab_receive.png"), _('Receive'))
self.lightning_invoices_tab = self.create_lightning_invoices_tab(wallet)
tabs.addTab(self.lightning_invoices_tab, _("Lightning Invoices"))
def add_optional_tab(tabs, tab, icon, description, name):
tab.tab_icon = icon
tab.tab_description = description
@ -750,6 +753,11 @@ class ElectrumWindow(QMainWindow, MessageBoxMixin, PrintError):
self.invoice_list.update()
self.update_completions()
def create_lightning_invoices_tab(self, wallet):
from .lightning_invoice_list import LightningInvoiceList
self.lightning_invoice_list = LightningInvoiceList(self, wallet.lightningworker, wallet.lightning)
return self.lightning_invoice_list
def create_history_tab(self):
from .history_list import HistoryList
self.history_list = l = HistoryList(self)
@ -1874,6 +1882,7 @@ class ElectrumWindow(QMainWindow, MessageBoxMixin, PrintError):
console.updateNamespace({'wallet' : self.wallet,
'network' : self.network,
'plugins' : self.gui_object.plugins,
'lightning' : self.gui_object.lightning,
'window': self})
console.updateNamespace({'util' : util, 'bitcoin':bitcoin})

View File

@ -4,7 +4,7 @@ from .wallet import Synchronizer, Wallet
from .storage import WalletStorage
from .coinchooser import COIN_CHOOSERS
from .network import Network, pick_random_server
from .interface import Connection, Interface
from .interface import Interface
from .simple_config import SimpleConfig, get_config, set_config
from . import bitcoin
from . import transaction

View File

@ -70,6 +70,15 @@ XPUB_HEADERS = {
class NetworkConstants:
@classmethod
def set_simnet(cls):
cls.TESTNET = True
cls.ADDRTYPE_P2PKH = 0x3f
cls.ADDRTYPE_P2SH = 0x7b
cls.SEGWIT_HRP = "sb"
cls.GENESIS = "683e86bd5c6d110d91b94b97137ba6bfe02dbbdb8e3dff722a669b5d69d77af6"
cls.DEFAULT_PORTS = {'t':'50001', 's':'50002'}
cls.DEFAULT_SERVERS = { '127.0.0.1': cls.DEFAULT_PORTS, }
@classmethod
def set_mainnet(cls):
@ -755,11 +764,10 @@ class EC_KEY(object):
def get_public_key(self, compressed=True):
return bh2u(point_to_ser(self.pubkey.point, compressed))
def sign(self, msg_hash):
def sign(self, msg_hash, sigencode=ecdsa.util.sigencode_string):
private_key = MySigningKey.from_secret_exponent(self.secret, curve = SECP256k1)
public_key = private_key.get_verifying_key()
signature = private_key.sign_digest_deterministic(msg_hash, hashfunc=hashlib.sha256, sigencode = ecdsa.util.sigencode_string)
assert public_key.verify_digest(signature, msg_hash, sigdecode = ecdsa.util.sigdecode_string)
signature = private_key.sign_digest_deterministic(msg_hash, hashfunc=hashlib.sha256, sigencode = sigencode)
return signature
def sign_message(self, message, is_compressed):
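
The hunk above threads a sigencode parameter through EC_KEY.sign so callers can choose the signature serialization instead of always getting the 64-byte r||s form. A minimal stand-alone sketch of the two encodings with the python-ecdsa library (the key and message below are arbitrary placeholders):

import hashlib
import ecdsa
from ecdsa.curves import SECP256k1

sk = ecdsa.SigningKey.from_secret_exponent(12345, curve=SECP256k1)
digest = hashlib.sha256(b"placeholder message").digest()

# 64-byte r||s encoding, what sign() produced unconditionally before this change
sig_string = sk.sign_digest_deterministic(digest, hashfunc=hashlib.sha256,
                                          sigencode=ecdsa.util.sigencode_string)
# DER encoding, which lib/lightning.py below requests for SignMessage and
# rawTxInWitnessSignature by passing ecdsa.util.sigencode_der
sig_der = sk.sign_digest_deterministic(digest, hashfunc=hashlib.sha256,
                                       sigencode=ecdsa.util.sigencode_der)
assert len(sig_string) == 64
assert sig_der[0] == 0x30  # DER sequences start with 0x30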

View File

@ -23,6 +23,7 @@
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import queue
import sys
import datetime
import copy
@ -41,6 +42,7 @@ from .i18n import _
from .transaction import Transaction, multisig_script
from .paymentrequest import PR_PAID, PR_UNPAID, PR_UNKNOWN, PR_EXPIRED
from .plugins import run_hook
from .import lightning
known_commands = {}
@ -691,6 +693,22 @@ class Commands:
# for the python console
return sorted(known_commands.keys())
@command("wn")
def lightning(self, lcmd, lightningargs=None):
q = queue.Queue()
class FakeQtSignal:
def emit(self, data):
q.put(data)
class MyConsole:
newResult = FakeQtSignal()
self.wallet.lightning.setConsole(MyConsole())
if lightningargs:
lightningargs = json_decode(lightningargs)
else:
lightningargs = []
lightning.lightningCall(self.wallet.lightning, lcmd)(*lightningargs)
return q.get(block=True, timeout=600)
param_descriptions = {
'privkey': 'Private key. Type \'?\' to get a prompt.',
'destination': 'Bitcoin address, contact or alias',
@ -741,6 +759,7 @@ command_options = {
'pending': (None, "Show only pending requests."),
'expired': (None, "Show only expired requests."),
'paid': (None, "Show only paid requests."),
'lightningargs':(None, "Arguments for an lncli subcommand, encoded as a JSON array"),
}
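
The lightning command above turns the asynchronous LightningRPC worker into a blocking CLI call: a throwaway object exposing a Qt-like newResult.emit() pushes the reply onto a queue, and the command then blocks on that queue. A self-contained sketch of the same bridge, with a dummy worker thread standing in for the real RPC:

import json
import queue
import threading
import time

result_queue = queue.Queue()

class FakeQtSignal:
    def emit(self, data):
        result_queue.put(data)

class FakeConsole:
    newResult = FakeQtSignal()

def fake_rpc_worker(console):
    # Stand-in for LightningRPC's network thread: pretend to call lnd, then
    # hand the JSON reply to whatever "console" was registered.
    time.sleep(0.1)
    console.newResult.emit(json.dumps({"channels": []}))

console = FakeConsole()
threading.Thread(target=fake_rpc_worker, args=(console,)).start()
print(result_queue.get(block=True, timeout=10))  # what the CLI would return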

View File

@ -22,287 +22,272 @@
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import aiosocks
import os
import stat
import re
import socket
import ssl
import sys
import threading
import time
import traceback
import asyncio
import json
import asyncio.streams
from asyncio.sslproto import SSLProtocol
import io
import requests
from .util import print_error
from aiosocks.errors import SocksError
from concurrent.futures import TimeoutError
ca_path = requests.certs.where()
from .util import print_error
from .ssl_in_socks import sslInSocksReaderWriter
from . import util
from . import x509
from . import pem
def Connection(server, queue, config_path):
"""Makes asynchronous connections to a remote electrum server.
Returns the running thread that is making the connection.
Once the thread has connected, it finishes, placing a tuple on the
queue of the form (server, socket), where socket is None if
connection failed.
"""
host, port, protocol = server.rsplit(':', 2)
if not protocol in 'st':
raise Exception('Unknown protocol: %s' % protocol)
c = TcpConnection(server, queue, config_path)
c.start()
return c
class TcpConnection(threading.Thread, util.PrintError):
def __init__(self, server, queue, config_path):
threading.Thread.__init__(self)
self.config_path = config_path
self.queue = queue
self.server = server
self.host, self.port, self.protocol = self.server.rsplit(':', 2)
self.host = str(self.host)
self.port = int(self.port)
self.use_ssl = (self.protocol == 's')
self.daemon = True
def diagnostic_name(self):
return self.host
def check_host_name(self, peercert, name):
"""Simple certificate/host name checker. Returns True if the
certificate matches, False otherwise. Does not support
wildcards."""
# Check that the peer has supplied a certificate.
# None/{} is not acceptable.
if not peercert:
return False
if 'subjectAltName' in peercert:
for typ, val in peercert["subjectAltName"]:
if typ == "DNS" and val == name:
return True
else:
# Only check the subject DN if there is no subject alternative
# name.
cn = None
for attr, val in peercert["subject"]:
# Use most-specific (last) commonName attribute.
if attr == "commonName":
cn = val
if cn is not None:
return cn == name
return False
def get_simple_socket(self):
try:
l = socket.getaddrinfo(self.host, self.port, socket.AF_UNSPEC, socket.SOCK_STREAM)
except socket.gaierror:
self.print_error("cannot resolve hostname")
return
e = None
for res in l:
try:
s = socket.socket(res[0], socket.SOCK_STREAM)
s.settimeout(10)
s.connect(res[4])
s.settimeout(2)
s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
return s
except BaseException as _e:
e = _e
continue
else:
self.print_error("failed to connect", str(e))
@staticmethod
def get_ssl_context(cert_reqs, ca_certs):
context = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH, cafile=ca_certs)
context.check_hostname = False
context.verify_mode = cert_reqs
context.options |= ssl.OP_NO_SSLv2
context.options |= ssl.OP_NO_SSLv3
context.options |= ssl.OP_NO_TLSv1
return context
def get_socket(self):
if self.use_ssl:
cert_path = os.path.join(self.config_path, 'certs', self.host)
if not os.path.exists(cert_path):
is_new = True
s = self.get_simple_socket()
if s is None:
return
# try with CA first
try:
context = self.get_ssl_context(cert_reqs=ssl.CERT_REQUIRED, ca_certs=ca_path)
s = context.wrap_socket(s, do_handshake_on_connect=True)
except ssl.SSLError as e:
print_error(e)
s = None
except:
return
if s and self.check_host_name(s.getpeercert(), self.host):
self.print_error("SSL certificate signed by CA")
return s
# get server certificate.
# Do not use ssl.get_server_certificate because it does not work with proxy
s = self.get_simple_socket()
if s is None:
return
try:
context = self.get_ssl_context(cert_reqs=ssl.CERT_NONE, ca_certs=None)
s = context.wrap_socket(s)
except ssl.SSLError as e:
self.print_error("SSL error retrieving SSL certificate:", e)
return
except:
return
dercert = s.getpeercert(True)
s.close()
cert = ssl.DER_cert_to_PEM_cert(dercert)
# workaround android bug
cert = re.sub("([^\n])-----END CERTIFICATE-----","\\1\n-----END CERTIFICATE-----",cert)
temporary_path = cert_path + '.temp'
with open(temporary_path,"w") as f:
f.write(cert)
else:
is_new = False
s = self.get_simple_socket()
if s is None:
return
if self.use_ssl:
try:
context = self.get_ssl_context(cert_reqs=ssl.CERT_REQUIRED,
ca_certs=(temporary_path if is_new else cert_path))
s = context.wrap_socket(s, do_handshake_on_connect=True)
except socket.timeout:
self.print_error('timeout')
return
except ssl.SSLError as e:
self.print_error("SSL error:", e)
if e.errno != 1:
return
if is_new:
rej = cert_path + '.rej'
if os.path.exists(rej):
os.unlink(rej)
os.rename(temporary_path, rej)
else:
with open(cert_path) as f:
cert = f.read()
try:
b = pem.dePem(cert, 'CERTIFICATE')
x = x509.X509(b)
except:
traceback.print_exc(file=sys.stderr)
self.print_error("wrong certificate")
return
try:
x.check_date()
except:
self.print_error("certificate has expired:", cert_path)
os.unlink(cert_path)
return
self.print_error("wrong certificate")
if e.errno == 104:
return
return
except BaseException as e:
self.print_error(e)
traceback.print_exc(file=sys.stderr)
return
if is_new:
self.print_error("saving certificate")
os.rename(temporary_path, cert_path)
return s
def run(self):
socket = self.get_socket()
if socket:
self.print_error("connected")
self.queue.put((self.server, socket))
def get_ssl_context(cert_reqs, ca_certs):
context = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH, cafile=ca_certs)
context.check_hostname = False
context.verify_mode = cert_reqs
context.options |= ssl.OP_NO_SSLv2
context.options |= ssl.OP_NO_SSLv3
context.options |= ssl.OP_NO_TLSv1
return context
class Interface(util.PrintError):
"""The Interface class handles a socket connected to a single remote
electrum server. Its exposed API is:
- Member functions close(), fileno(), get_responses(), has_timed_out(),
ping_required(), queue_request(), send_requests()
- Member functions close(), fileno(), get_response(), has_timed_out(),
ping_required(), queue_request(), send_request()
- Member variable server.
"""
def __init__(self, server, socket):
self.server = server
self.host, _, _ = server.rsplit(':', 2)
self.socket = socket
def __init__(self, server, config_path, proxy_config, is_running):
self.is_running = is_running
self.addr = self.auth = None
if proxy_config is not None:
if proxy_config["mode"] == "socks5":
self.addr = aiosocks.Socks5Addr(proxy_config["host"], proxy_config["port"])
self.auth = aiosocks.Socks5Auth(proxy_config["user"], proxy_config["password"]) if proxy_config["user"] != "" else None
elif proxy_config["mode"] == "socks4":
self.addr = aiosocks.Socks4Addr(proxy_config["host"], proxy_config["port"])
self.auth = aiosocks.Socks4Auth(proxy_config["password"]) if proxy_config["password"] != "" else None
else:
raise Exception("proxy mode not supported")
self.pipe = util.SocketPipe(socket)
self.pipe.set_timeout(0.0) # Don't wait for data
self.server = server
self.config_path = config_path
host, port, protocol = self.server.split(':')
self.host = host
self.port = int(port)
self.use_ssl = (protocol=='s')
self.reader = self.writer = None
self.lock = asyncio.Lock()
# Dump network messages. Set at runtime from the console.
self.debug = False
self.unsent_requests = []
self.unsent_requests = asyncio.PriorityQueue()
self.unanswered_requests = {}
# Set last ping to zero to ensure immediate ping
self.last_request = time.time()
self.last_ping = 0
self.closed_remotely = False
self.buf = bytes()
def conn_coro(self, context):
return asyncio.open_connection(self.host, self.port, ssl=context)
async def _save_certificate(self, cert_path, require_ca):
dercert = None
if require_ca:
context = get_ssl_context(cert_reqs=ssl.CERT_REQUIRED, ca_certs=ca_path)
else:
context = get_ssl_context(cert_reqs=ssl.CERT_NONE, ca_certs=None)
try:
if self.addr is not None:
proto_factory = lambda: SSLProtocol(asyncio.get_event_loop(), asyncio.Protocol(), context, None)
socks_create_coro = aiosocks.create_connection(proto_factory, \
proxy=self.addr, \
proxy_auth=self.auth, \
dst=(self.host, self.port))
transport, protocol = await asyncio.wait_for(socks_create_coro, 5)
async def job(fut):
try:
if protocol._sslpipe is not None:
fut.set_result(protocol._sslpipe.ssl_object.getpeercert(True))
except BaseException as e:
fut.set_exception(e)
while self.is_running():
fut = asyncio.Future()
asyncio.ensure_future(job(fut))
try:
await fut
except:
pass
try:
fut.exception()
dercert = fut.result()
except ValueError:
await asyncio.sleep(1)
continue
except:
if self.is_running():
traceback.print_exc()
print("Previous exception from _save_certificate")
continue
break
if not self.is_running(): return
transport.close()
else:
reader, writer = await asyncio.wait_for(self.conn_coro(context), 3)
dercert = writer.get_extra_info('ssl_object').getpeercert(True)
writer.close()
except OSError as e: # not ConnectionError because we need socket.gaierror too
if self.is_running():
self.print_error(self.server, "Exception in _save_certificate", type(e))
return
except TimeoutError:
return
assert dercert
if not require_ca:
cert = ssl.DER_cert_to_PEM_cert(dercert)
else:
# Don't pin a CA signed certificate
cert = ""
temporary_path = cert_path + '.temp'
with open(temporary_path, "w") as f:
f.write(cert)
return temporary_path
async def _get_read_write(self):
async with self.lock:
if self.reader is not None and self.writer is not None:
return self.reader, self.writer, True
if self.use_ssl:
cert_path = os.path.join(self.config_path, 'certs', self.host)
if not os.path.exists(cert_path):
temporary_path = await self._save_certificate(cert_path, True)
if not temporary_path:
temporary_path = await self._save_certificate(cert_path, False)
if not temporary_path:
raise ConnectionError("Could not get certificate on second try")
is_new = True
else:
is_new = False
ca_certs = temporary_path if is_new else cert_path
size = os.stat(ca_certs)[stat.ST_SIZE]
self_signed = size != 0
if not self_signed:
ca_certs = ca_path
try:
if self.addr is not None:
if not self.use_ssl:
open_coro = aiosocks.open_connection(proxy=self.addr, proxy_auth=self.auth, dst=(self.host, self.port))
self.reader, self.writer = await asyncio.wait_for(open_coro, 5)
else:
ssl_in_socks_coro = sslInSocksReaderWriter(self.addr, self.auth, self.host, self.port, ca_certs)
self.reader, self.writer = await asyncio.wait_for(ssl_in_socks_coro, 5)
else:
context = get_ssl_context(cert_reqs=ssl.CERT_REQUIRED, ca_certs=ca_certs) if self.use_ssl else None
self.reader, self.writer = await asyncio.wait_for(self.conn_coro(context), 5)
except TimeoutError:
self.print_error("TimeoutError after getting certificate successfully...")
raise
except BaseException as e:
if self.is_running():
if not isinstance(e, OSError):
traceback.print_exc()
self.print_error("Previous exception will now be reraised")
raise e
if self.use_ssl and is_new:
self.print_error("saving new certificate for", self.host)
os.rename(temporary_path, cert_path)
return self.reader, self.writer, False
async def send_all(self, list_of_requests):
_, w, usedExisting = await self._get_read_write()
starttime = time.time()
for i in list_of_requests:
w.write(json.dumps(i).encode("ascii") + b"\n")
await w.drain()
if time.time() - starttime > 2.5:
self.print_error("send_all: sending is taking too long. Used existing connection: ", usedExisting)
raise ConnectionError("sending is taking too long")
def close(self):
if self.writer:
self.writer.close()
def _try_extract(self):
try:
pos = self.buf.index(b"\n")
except ValueError:
return
obj = self.buf[:pos]
try:
obj = json.loads(obj.decode("ascii"))
except ValueError:
return
else:
self.buf = self.buf[pos+1:]
self.last_action = time.time()
return obj
async def get(self):
reader, _, _ = await self._get_read_write()
while self.is_running():
tried = self._try_extract()
if tried: return tried
temp = io.BytesIO()
try:
data = await asyncio.wait_for(reader.read(2**10), 1)
temp.write(data)
except asyncio.TimeoutError:
continue
self.buf += temp.getvalue()
def idle_time(self):
return time.time() - self.last_action
def diagnostic_name(self):
return self.host
def fileno(self):
# Needed for select
return self.socket.fileno()
def close(self):
if not self.closed_remotely:
try:
self.socket.shutdown(socket.SHUT_RDWR)
except socket.error:
pass
self.socket.close()
def queue_request(self, *args): # method, params, _id
async def queue_request(self, *args): # method, params, _id
'''Queue a request, later to be sent with send_request when the
socket is available for writing.
'''
self.request_time = time.time()
self.unsent_requests.append(args)
await self.unsent_requests.put((self.request_time, args))
def num_requests(self):
'''Keep unanswered requests below 100'''
n = 100 - len(self.unanswered_requests)
return min(n, len(self.unsent_requests))
return min(n, self.unsent_requests.qsize())
def send_requests(self):
async def send_request(self):
'''Sends one queued request. Returns False on failure.'''
make_dict = lambda m, p, i: {'method': m, 'params': p, 'id': i}
n = self.num_requests()
wire_requests = self.unsent_requests[0:n]
try:
self.pipe.send_all([make_dict(*r) for r in wire_requests])
except socket.error as e:
self.print_error("socket error:", e)
prio, request = await asyncio.wait_for(self.unsent_requests.get(), 1.5)
except TimeoutError:
return False
self.unsent_requests = self.unsent_requests[n:]
for request in wire_requests:
if self.debug:
self.print_error("-->", request)
self.unanswered_requests[request[2]] = request
try:
await self.send_all([make_dict(*request)])
except (SocksError, OSError, TimeoutError) as e:
if type(e) is SocksError:
self.print_error(e)
await self.unsent_requests.put((prio, request))
return False
if self.debug:
self.print_error("-->", request)
self.unanswered_requests[request[2]] = request
self.last_action = time.time()
return True
def ping_required(self):
@ -318,13 +303,12 @@ class Interface(util.PrintError):
def has_timed_out(self):
'''Returns True if the interface has timed out.'''
if (self.unanswered_requests and time.time() - self.request_time > 10
and self.pipe.idle_time() > 10):
and self.idle_time() > 10):
self.print_error("timeout", len(self.unanswered_requests))
return True
return False
def get_responses(self):
async def get_response(self):
'''Call if there is data available on the socket. Returns a single
(request, response) pair. Notifications are singleton
unsolicited responses presumably as a result of prior
@ -333,34 +317,25 @@ class Interface(util.PrintError):
corresponding request. If the connection was closed remotely
or the remote server is misbehaving, a (None, None) will appear.
'''
responses = []
while True:
try:
response = self.pipe.get()
except util.timeout:
break
if not type(response) is dict:
responses.append((None, None))
if response is None:
self.closed_remotely = True
response = await self.get()
if not type(response) is dict:
if response is None:
self.closed_remotely = True
if self.is_running():
self.print_error("connection closed remotely")
break
if self.debug:
self.print_error("<--", response)
wire_id = response.get('id', None)
if wire_id is None: # Notification
responses.append((None, response))
return None, None
if self.debug:
self.print_error("<--", response)
wire_id = response.get('id', None)
if wire_id is None: # Notification
return None, response
else:
request = self.unanswered_requests.pop(wire_id, None)
if request:
return request, response
else:
request = self.unanswered_requests.pop(wire_id, None)
if request:
responses.append((request, response))
else:
self.print_error("unknown wire ID", wire_id)
responses.append((None, None)) # Signal
break
return responses
self.print_error("unknown wire ID", wire_id)
return None, None # Signal
def check_cert(host, cert):
try:
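
The rewritten Interface stores unsent requests in an asyncio.PriorityQueue keyed by enqueue time, and the old synchronous send_requests becomes an async send_request that waits briefly for the next item and re-queues it if the write fails. A runnable sketch of that send loop in isolation (the writer callback and request contents are placeholders, not the real Electrum wire handling):

import asyncio
import json
import time

async def send_loop(unsent, wire_writer):
    # Mirrors the shape of Interface.send_request above: pull the oldest queued
    # request, try to send it, and re-queue it with the same priority on failure.
    try:
        prio, request = await asyncio.wait_for(unsent.get(), 1.5)
    except asyncio.TimeoutError:
        return False
    method, params, req_id = request
    try:
        await wire_writer(json.dumps({'method': method, 'params': params, 'id': req_id}) + '\n')
    except OSError:
        await unsent.put((prio, request))
        return False
    return True

async def demo():
    unsent = asyncio.PriorityQueue()
    await unsent.put((time.time(), ('server.version', ['sketch'], 0)))
    ok = await send_loop(unsent, lambda payload: asyncio.sleep(0))  # dummy writer
    print('sent:', ok)

asyncio.run(demo())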

912
lib/lightning.py Normal file
View File

@ -0,0 +1,912 @@
import functools
import sys
import struct
import traceback
sys.path.insert(0, "lib/ln")
from .ln import rpc_pb2
from jsonrpclib import Server
from google.protobuf import json_format
import binascii
import ecdsa.util
import hashlib
from .bitcoin import EC_KEY, MySigningKey
from ecdsa.curves import SECP256k1
from . import bitcoin
from . import transaction
from . import keystore
import queue
from .util import ForeverCoroutineJob
import threading
import json
import base64
import asyncio
from concurrent.futures import TimeoutError
WALLET = None
NETWORK = None
CONFIG = None
locked = set()
machine = "148.251.87.112"
#machine = "127.0.0.1"
def WriteDb(json):
req = rpc_pb2.WriteDbRequest()
json_format.Parse(json, req)
print("writedb unimplemented", req.dbData)
m = rpc_pb2.WriteDbResponse()
msg = json_format.MessageToJson(m)
return msg
def ConfirmedBalance(json):
request = rpc_pb2.ConfirmedBalanceRequest()
json_format.Parse(json, request)
m = rpc_pb2.ConfirmedBalanceResponse()
confs = request.confirmations
#witness = request.witness # bool
m.amount = sum(WALLET.get_balance())
msg = json_format.MessageToJson(m)
return msg
def NewAddress(json):
request = rpc_pb2.NewAddressRequest()
json_format.Parse(json, request)
m = rpc_pb2.NewAddressResponse()
if request.type == rpc_pb2.WITNESS_PUBKEY_HASH:
m.address = WALLET.get_unused_address()
elif request.type == rpc_pb2.NESTED_PUBKEY_HASH:
assert False, "cannot handle nested-pubkey-hash address type generation yet"
elif request.type == rpc_pb2.PUBKEY_HASH:
assert False, "cannot handle pubkey_hash generation yet"
else:
assert False, "unknown address type"
msg = json_format.MessageToJson(m)
return msg
#def FetchRootKey(json):
# request = rpc_pb2.FetchRootKeyRequest()
# json_format.Parse(json, request)
# m = rpc_pb2.FetchRootKeyResponse()
# m.rootKey = WALLET.keystore.get_private_key([151,151,151,151], None)[0]
# msg = json_format.MessageToJson(m)
# return msg
cl = rpc_pb2.ListUnspentWitnessRequest
assert rpc_pb2.WITNESS_PUBKEY_HASH is not None
def ListUnspentWitness(json):
req = cl()
json_format.Parse(json, req)
confs = req.minConfirmations #TODO regard this
unspent = WALLET.get_utxos()
m = rpc_pb2.ListUnspentWitnessResponse()
for utxo in unspent:
# print(utxo)
# example:
# {'prevout_n': 0,
# 'address': 'sb1qt52ccplvtpehz7qvvqft2udf2eaqvfsal08xre',
# 'prevout_hash': '0d4caccd6e8a906c8ca22badf597c4dedc6dd7839f3cac3137f8f29212099882',
# 'coinbase': False,
# 'height': 326,
# 'value': 400000000}
global locked
if (utxo["prevout_hash"], utxo["prevout_n"]) in locked:
print("SKIPPING LOCKED OUTPOINT", utxo["prevout_hash"])
continue
towire = m.utxos.add()
towire.addressType = rpc_pb2.WITNESS_PUBKEY_HASH
towire.redeemScript = b""
towire.pkScript = b""
towire.witnessScript = bytes(bytearray.fromhex(
bitcoin.address_to_script(utxo["address"])))
towire.value = utxo["value"]
towire.outPoint.hash = utxo["prevout_hash"]
towire.outPoint.index = utxo["prevout_n"]
return json_format.MessageToJson(m)
def LockOutpoint(json):
req = rpc_pb2.LockOutpointRequest()
json_format.Parse(json, req)
global locked
locked.add((req.outpoint.hash, req.outpoint.index))
def UnlockOutpoint(json):
req = rpc_pb2.UnlockOutpointRequest()
json_format.Parse(json, req)
global locked
# throws KeyError if not existing. Use .discard() if we do not care
locked.remove((req.outpoint.hash, req.outpoint.index))
def ListTransactionDetails(json):
global WALLET
global NETWORK
m = rpc_pb2.ListTransactionDetailsResponse()
for tx_hash, height, conf, timestamp, delta, balance in WALLET.get_history():
if height == 0:
print("WARNING", tx_hash, "has zero height!")
detail = m.details.add()
detail.hash = tx_hash
detail.value = delta
detail.numConfirmations = conf
detail.blockHash = NETWORK.blockchain().get_hash(height)
detail.blockHeight = height
detail.timestamp = timestamp
detail.totalFees = 1337 # TODO
return json_format.MessageToJson(m)
def FetchInputInfo(json):
req = rpc_pb2.FetchInputInfoRequest()
json_format.Parse(json, req)
has = req.outPoint.hash
idx = req.outPoint.index
txoinfo = WALLET.txo.get(has, {})
m = rpc_pb2.FetchInputInfoResponse()
if has in WALLET.transactions:
tx = WALLET.transactions[has]
m.mine = True
else:
tx = WALLET.get_input_tx(has)
print("did not find tx with hash", has)
print("tx", tx)
m.mine = False
return json_format.MessageToJson(m)
outputs = tx.outputs()
assert {bitcoin.TYPE_SCRIPT: "SCRIPT", bitcoin.TYPE_ADDRESS: "ADDRESS",
bitcoin.TYPE_PUBKEY: "PUBKEY"}[outputs[idx][0]] == "ADDRESS"
scr = transaction.Transaction.pay_script(outputs[idx][0], outputs[idx][1])
m.txOut.value = outputs[idx][2] # type, addr, val
m.txOut.pkScript = bytes(bytearray.fromhex(scr))
msg = json_format.MessageToJson(m)
return msg
def SendOutputs(json):
global NETWORK, WALLET, CONFIG
req = rpc_pb2.SendOutputsRequest()
json_format.Parse(json, req)
m = rpc_pb2.SendOutputsResponse()
elecOutputs = [(bitcoin.TYPE_SCRIPT, binascii.hexlify(txout.pkScript).decode("utf-8"), txout.value) for txout in req.outputs]
print("ignoring feeSatPerByte", req.feeSatPerByte) # TODO
tx = None
try:
# outputs, password, config, fee
tx = WALLET.mktx(elecOutputs, None, CONFIG, 1000)
except Exception as e:
m.success = False
m.error = str(e)
m.resultHash = ""
return json_format.MessageToJson(m)
suc, has = NETWORK.broadcast(tx)
if not suc:
m.success = False
m.error = "electrum/lightning/SendOutputs: Could not broadcast: " + str(has)
m.resultHash = ""
return json_format.MessageToJson(m)
m.success = True
m.error = ""
m.resultHash = tx.txid()
return json_format.MessageToJson(m)
def isSynced():
global NETWORK
local_height, server_height = NETWORK.get_status_value("updated")
synced = server_height != 0 and NETWORK.is_up_to_date() and local_height >= server_height
return synced, local_height, server_height
def IsSynced(json):
m = rpc_pb2.IsSyncedResponse()
m.synced, localHeight, _ = isSynced()
block = NETWORK.blockchain().read_header(localHeight)
m.lastBlockTimeStamp = block["timestamp"]
return json_format.MessageToJson(m)
def SignMessage(json):
req = rpc_pb2.SignMessageRequest()
json_format.Parse(json, req)
m = rpc_pb2.SignMessageResponse()
pri = privKeyForPubKey(req.pubKey)
m.signature = pri.sign(bitcoin.Hash(req.messageToBeSigned), ecdsa.util.sigencode_der)
m.error = ""
m.success = True
return json_format.MessageToJson(m)
def LEtobytes(x, l):
if l == 2:
fmt = "<H"
elif l == 4:
fmt = "<I"
elif l == 8:
fmt = "<Q"
else:
assert False, "invalid format for LEtobytes"
return struct.pack(fmt, x)
def toint(x):
if len(x) == 1:
return ord(x)
elif len(x) == 2:
fmt = ">H"
elif len(x) == 4:
fmt = ">I"
elif len(x) == 8:
fmt = ">Q"
else:
assert False, "invalid length for toint(): " + str(len(x))
return struct.unpack(fmt, x)[0]
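# Illustrative sanity check of the two helpers above (arbitrary values): note
# that LEtobytes() packs little-endian while toint() unpacks big-endian, so the
# two are intentionally not inverses of each other.
assert LEtobytes(1, 4) == b"\x01\x00\x00\x00"
assert toint(b"\x00\x01") == 1
assert toint(LEtobytes(1, 2)) == 256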
class TxSigHashes(object):
def __init__(self, hashOutputs=None, hashSequence=None, hashPrevOuts=None):
self.hashOutputs = hashOutputs
self.hashSequence = hashSequence
self.hashPrevOuts = hashPrevOuts
class Output(object):
def __init__(self, value=None, pkScript=None):
assert value is not None and pkScript is not None
self.value = value
self.pkScript = pkScript
class InputScript(object):
def __init__(self, scriptSig, witness):
assert witness is None or type(witness[0]) is type(bytes([]))
assert type(scriptSig) is type(bytes([]))
self.scriptSig = scriptSig
self.witness = witness
def tweakPrivKey(basePriv, commitTweak):
tweakInt = int.from_bytes(commitTweak, byteorder="big")
tweakInt += basePriv.secret # D is secret
tweakInt %= SECP256k1.generator.order()
return EC_KEY(tweakInt.to_bytes(32, 'big'))
def singleTweakBytes(commitPoint, basePoint):
m = hashlib.sha256()
m.update(bytearray.fromhex(commitPoint))
m.update(bytearray.fromhex(basePoint))
return m.digest()
def deriveRevocationPrivKey(revokeBasePriv, commitSecret):
revokeTweakBytes = singleTweakBytes(revokeBasePriv.get_public_key(True),
commitSecret.get_public_key(True))
revokeTweakInt = int.from_bytes(revokeTweakBytes, byteorder="big")
commitTweakBytes = singleTweakBytes(commitSecret.get_public_key(True),
revokeBasePriv.get_public_key(True))
commitTweakInt = int.from_bytes(commitTweakBytes, byteorder="big")
revokeHalfPriv = revokeTweakInt * revokeBasePriv.secret # D is secret
commitHalfPriv = commitTweakInt * commitSecret.secret
revocationPriv = revokeHalfPriv + commitHalfPriv
revocationPriv %= SECP256k1.generator.order()
return EC_KEY(revocationPriv.to_bytes(32, byteorder="big"))
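# Illustrative sketch (uses the hashlib and SECP256k1 imports above; the byte
# strings are arbitrary placeholders, not real channel points): the additive
# tweak applied in tweakPrivKey commutes with public-key derivation, i.e.
# (base + tweak) mod n generates the point base*G + tweak*G.
_n = SECP256k1.generator.order()
_base = 7
_tweak = int.from_bytes(hashlib.sha256(b"commit_point" + b"base_point").digest(), "big") % _n
assert SECP256k1.generator * ((_base + _tweak) % _n) == SECP256k1.generator * _base + SECP256k1.generator * _tweak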
def maybeTweakPrivKey(signdesc, pri):
if len(signdesc.singleTweak) > 0:
pri2 = tweakPrivKey(pri, signdesc.singleTweak)
elif len(signdesc.doubleTweak) > 0:
pri2 = deriveRevocationPrivKey(pri, EC_KEY(signdesc.doubleTweak))
else:
pri2 = pri
if pri2 != pri:
have_keys = WALLET.storage.get("lightning_extra_keys", [])
if pri2.secret not in have_keys:
WALLET.storage.put("lightning_extra_keys", have_keys + [pri2.secret])
WALLET.storage.write()
print("saved new tweaked key", pri2.secret)
return pri2
def isWitnessPubKeyHash(script):
if len(script) != 2:
return False
haveop0 = (transaction.opcodes.OP_0 == script[0][0])
haveopdata20 = (20 == script[1][0])
return haveop0 and haveopdata20
#// calcWitnessSignatureHash computes the sighash digest of a transaction's
#// segwit input using the new, optimized digest calculation algorithm defined
#// in BIP0143: https://github.com/bitcoin/bips/blob/master/bip-0143.mediawiki.
#// This function makes use of pre-calculated sighash fragments stored within
#// the passed HashCache to eliminate duplicate hashing computations when
#// calculating the final digest, reducing the complexity from O(N^2) to O(N).
#// Additionally, signatures now cover the input value of the referenced unspent
#// output. This allows offline, or hardware wallets to compute the exact amount
#// being spent, in addition to the final transaction fee. In the case the
#// wallet is fed an invalid input amount, the real sighash will differ causing
#// the produced signature to be invalid.
def calcWitnessSignatureHash(original, sigHashes, hashType, tx, idx, amt):
assert len(original) != 0
decoded = transaction.deserialize(binascii.hexlify(tx).decode("utf-8"))
if idx > len(decoded["inputs"]) - 1:
raise Exception("invalid inputIndex")
txin = decoded["inputs"][idx]
#tohash = transaction.Transaction.serialize_witness(txin)
sigHash = LEtobytes(decoded["version"], 4)
if toint(hashType) & toint(sigHashAnyOneCanPay) == 0:
sigHash += bytes(bytearray.fromhex(sigHashes.hashPrevOuts))[::-1]
else:
sigHash += b"\x00" * 32
if toint(hashType) & toint(sigHashAnyOneCanPay) == 0 and toint(hashType) & toint(sigHashMask) != toint(sigHashSingle) and toint(hashType) & toint(sigHashMask) != toint(sigHashNone):
sigHash += bytes(bytearray.fromhex(sigHashes.hashSequence))[::-1]
else:
sigHash += b"\x00" * 32
sigHash += bytes(bytearray.fromhex(txin["prevout_hash"]))[::-1]
sigHash += LEtobytes(txin["prevout_n"], 4)
# byte 72
subscript = list(transaction.script_GetOp(original))
if isWitnessPubKeyHash(subscript):
sigHash += b"\x19"
sigHash += bytes([transaction.opcodes.OP_DUP])
sigHash += bytes([transaction.opcodes.OP_HASH160])
sigHash += b"\x14" # 20 bytes
assert len(subscript) == 2, subscript
opcode, data, length = subscript[1]
sigHash += data
sigHash += bytes([transaction.opcodes.OP_EQUALVERIFY])
sigHash += bytes([transaction.opcodes.OP_CHECKSIG])
else:
# For p2wsh outputs, and future outputs, the script code is
# the original script, with all code separators removed,
# serialized with a var int length prefix.
assert len(sigHash) == 104, len(sigHash)
sigHash += bytes(bytearray.fromhex(bitcoin.var_int(len(original))))
assert len(sigHash) == 105, len(sigHash)
sigHash += original
sigHash += LEtobytes(amt, 8)
sigHash += LEtobytes(txin["sequence"], 4)
if toint(hashType) & toint(sigHashSingle) != toint(sigHashSingle) and toint(hashType) & toint(sigHashNone) != toint(sigHashNone):
sigHash += bytes(bytearray.fromhex(sigHashes.hashOutputs))[::-1]
elif toint(hashType) & toint(sigHashMask) == toint(sigHashSingle) and idx < len(decoded["outputs"]):
raise Exception("TODO 1")
else:
raise Exception("TODO 2")
sigHash += LEtobytes(decoded["lockTime"], 4)
sigHash += LEtobytes(toint(hashType), 4)
return transaction.Hash(sigHash)
#// RawTxInWitnessSignature returns the serialized ECDSA signature for the input
#// idx of the given transaction, with the hashType appended to it. This
#// function is identical to RawTxInSignature, however the signature generated
#// signs a new sighash digest defined in BIP0143.
# func RawTxInWitnessSignature(tx *MsgTx, sigHashes *TxSigHashes, idx int,
# amt int64, subScript []byte, hashType SigHashType,
# key *btcec.PrivateKey) ([]byte, error) {
def rawTxInWitnessSignature(tx, sigHashes, idx, amt, subscript, hashType, key):
digest = calcWitnessSignatureHash(
subscript, sigHashes, hashType, tx, idx, amt)
return key.sign(digest, sigencode=ecdsa.util.sigencode_der) + hashType
# WitnessSignature creates an input witness stack for tx to spend BTC sent
# from a previous output to the owner of privKey using the p2wkh script
# template. The passed transaction must contain all the inputs and outputs as
# dictated by the passed hashType. The signature generated observes the new
# transaction digest algorithm defined within BIP0143.
def witnessSignature(tx, sigHashes, idx, amt, subscript, hashType, privKey, compress):
sig = rawTxInWitnessSignature(
tx, sigHashes, idx, amt, subscript, hashType, privKey)
pkData = bytes(bytearray.fromhex(
privKey.get_public_key(compressed=compress)))
return sig, pkData
sigHashMask = b"\x1f"
sigHashAll = b"\x01"
sigHashNone = b"\x02"
sigHashSingle = b"\x03"
sigHashAnyOneCanPay = b"\x80"
test = rpc_pb2.ComputeInputScriptResponse()
test.witnessScript.append(b"\x01")
test.witnessScript.append(b"\x02")
def SignOutputRaw(json):
req = rpc_pb2.SignOutputRawRequest()
json_format.Parse(json, req)
#assert len(req.signDesc.pubKey) in [33, 0]
assert len(req.signDesc.doubleTweak) in [32, 0]
assert len(req.signDesc.sigHashes.hashPrevOuts) == 64
assert len(req.signDesc.sigHashes.hashSequence) == 64
assert len(req.signDesc.sigHashes.hashOutputs) == 64
m = rpc_pb2.SignOutputRawResponse()
m.signature = signOutputRaw(req.tx, req.signDesc)
msg = json_format.MessageToJson(m)
return msg
def signOutputRaw(tx, signDesc):
pri = derivePrivKey(signDesc.keyDescriptor)
assert pri is not None
pri2 = maybeTweakPrivKey(signDesc, pri)
sig = rawTxInWitnessSignature(tx, signDesc.sigHashes, signDesc.inputIndex,
signDesc.output.value, signDesc.witnessScript, sigHashAll, pri2)
return sig[:len(sig) - 1]
async def PublishTransaction(json):
req = rpc_pb2.PublishTransactionRequest()
json_format.Parse(json, req)
global NETWORK
tx = transaction.Transaction(binascii.hexlify(req.tx).decode("utf-8"))
suc, has = await NETWORK.broadcast_async(tx)
m = rpc_pb2.PublishTransactionResponse()
m.success = suc
m.error = str(has) if not suc else ""
if m.error:
print("PublishTransaction", m.error)
if "Missing inputs" in m.error:
print("inputs", tx.inputs())
return json_format.MessageToJson(m)
def ComputeInputScript(json):
req = rpc_pb2.ComputeInputScriptRequest()
json_format.Parse(json, req)
#assert len(req.signDesc.pubKey) in [33, 0]
assert len(req.signDesc.doubleTweak) in [32, 0]
assert len(req.signDesc.sigHashes.hashPrevOuts) == 64
assert len(req.signDesc.sigHashes.hashSequence) == 64
assert len(req.signDesc.sigHashes.hashOutputs) == 64
# singleTweak , witnessScript variable length
try:
inpscr = computeInputScript(req.tx, req.signDesc)
except:
print("catched!")
traceback.print_exc()
return None
m = rpc_pb2.ComputeInputScriptResponse()
m.witnessScript.append(inpscr.witness[0])
m.witnessScript.append(inpscr.witness[1])
m.scriptSig = inpscr.scriptSig
msg = json_format.MessageToJson(m)
return msg
def fetchPrivKey(str_address, keyLocatorFamily, keyLocatorIndex):
pri = None
if str_address is not None:
pri, redeem_script = WALLET.export_private_key(str_address, None)
if redeem_script:
print("ignoring redeem script", redeem_script)
typ, pri, compressed = bitcoin.deserialize_privkey(pri)
if keyLocatorFamily == 0 and keyLocatorIndex == 0: return EC_KEY(pri)
ks = keystore.BIP32_KeyStore({})
der = "m/0'/"
xtype = 'p2wpkh'
ks.add_xprv_from_seed(pri, xtype, der)
else:
ks = WALLET.keystore
if keyLocatorFamily != 0 or keyLocatorIndex != 0:
pri = ks.get_private_key([1017, keyLocatorFamily, keyLocatorIndex], password=None)[0]
pri = EC_KEY(pri)
assert pri is not None
return pri
def computeInputScript(tx, signdesc):
typ, str_address = transaction.get_address_from_output_script(
signdesc.output.pkScript)
assert typ != bitcoin.TYPE_SCRIPT
assert len(signdesc.keyDescriptor.pubKey) == 0
pri = fetchPrivKey(str_address, signdesc.keyDescriptor.keyLocator.family, signdesc.keyDescriptor.keyLocator.index)
isNestedWitness = False # because NewAddress only does native addresses
witnessProgram = None
ourScriptSig = None
if isNestedWitness:
pub = pri.get_public_key()
scr = bitcoin.hash_160(pub)
witnessProgram = b"\x00\x14" + scr
# 0x14 pushes the next 20 bytes (the pubkey hash)
ourScriptSig = b"\x16\x00\x14" + scr
else:
# TODO TEST
witnessProgram = signdesc.output.pkScript
ourScriptSig = b""
print("set empty ourScriptSig")
print("witnessProgram", witnessProgram)
# If a tweak (single or double) is specified, then we'll need to use
# this tweak to derive the final private key to be used for signing
# this output.
pri2 = maybeTweakPrivKey(signdesc, pri)
#
# Generate a valid witness stack for the input.
# TODO(roasbeef): adhere to passed HashType
witnessScript, pkData = witnessSignature(tx, signdesc.sigHashes,
signdesc.inputIndex, signdesc.output.value, witnessProgram,
sigHashAll, pri2, True)
return InputScript(witness=(witnessScript, pkData), scriptSig=ourScriptSig)
from collections import namedtuple
QueueItem = namedtuple("QueueItem", ["methodName", "args"])
class LightningRPC(ForeverCoroutineJob):
def __init__(self):
super(LightningRPC, self).__init__()
self.queue = queue.Queue()
self.subscribers = []
self.console = None
# overridden
async def run(self, is_running):
print("RPC STARTED")
while is_running():
try:
qitem = self.queue.get(block=False)
except queue.Empty:
await asyncio.sleep(1)
pass
else:
def lightningRpcNetworkRequestThreadTarget(qitem):
applyMethodName = lambda x: functools.partial(x, qitem.methodName)
client = Server("http://" + machine + ":8090")
argumentStrings = [str(x) for x in qitem.args]
lightningSessionKey = base64.b64encode(privateKeyHash[:6]).decode("ascii")
resolvedMethod = getattr(client, qitem.methodName)
try:
result = resolvedMethod(lightningSessionKey, *argumentStrings)
except BaseException as e:
traceback.print_exc()
for i in self.subscribers: applyMethodName(i)(e)
raise
toprint = result
try:
assert result["stderr"] == "" and result["returncode"] == 0, "LightningRPC detected error: " + result["stderr"]
toprint = json.loads(result["stdout"])
for i in self.subscribers: applyMethodName(i)(toprint)
except BaseException as e:
traceback.print_exc()
for i in self.subscribers: applyMethodName(i)(e)
if self.console:
self.console.newResult.emit(json.dumps(toprint, indent=4))
threading.Thread(target=lightningRpcNetworkRequestThreadTarget, args=(qitem, )).start()
def setConsole(self, console):
self.console = console
def subscribe(self, notifyFunction):
self.subscribers.append(notifyFunction)
def clearSubscribers(self):
self.subscribers = []
def lightningCall(rpc, methodName):
def fun(*args):
rpc.queue.put(QueueItem(methodName, args))
return fun
class LightningUI():
def __init__(self, lightningGetter):
self.rpc = lightningGetter
def __getattr__(self, nam):
synced, local, server = isSynced()
if not synced:
return lambda *args: "Not synced yet: local/server: {}/{}".format(local, server)
return lightningCall(self.rpc(), nam)
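# Hypothetical usage sketch (not in the original source): the GUI wraps the wallet's
# LightningRPC job (created in wallet.py as self.lightning) in a LightningUI and calls
# arbitrary RPC method names on it; each call only enqueues a QueueItem that the
# LightningRPC worker later forwards to the hub.
#   ui = LightningUI(lambda: wallet.lightning)
#   ui.getinfo()            # enqueues QueueItem("getinfo", ())
#   ui.addinvoice("10000")  # enqueues QueueItem("addinvoice", ("10000",))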
privateKeyHash = None
class LightningWorker(ForeverCoroutineJob):
def __init__(self, wallet, network, config):
global privateKeyHash
super(LightningWorker, self).__init__()
self.server = None
self.wallet = wallet
self.network = network
self.config = config
ks = self.wallet().keystore
assert hasattr(ks, "xprv"), "Wallet must have xprv, can't be e.g. imported"
try:
xprv = ks.get_master_private_key(None)
except:
raise BaseException("Could not get master private key, is the wallet password protected?")
xprv, xpub = bitcoin.bip32_private_derivation(xprv, "m/", "m/152/152/152/152")
tupl = bitcoin.deserialize_xprv(xprv)
privKey = tupl[-1]
assert isinstance(privKey, bytes)
privateKeyHash = bitcoin.Hash(privKey)
deser = bitcoin.deserialize_xpub(wallet().keystore.xpub)
assert deser[0] == "p2wpkh", deser
self.subscribers = []
async def run(self, is_running):
global WALLET, NETWORK
global CONFIG
wasAlreadyUpToDate = False
while is_running():
WALLET = self.wallet()
NETWORK = self.network()
CONFIG = self.config()
synced, local, server = isSynced()
if not synced:
await asyncio.sleep(5)
continue
else:
if not wasAlreadyUpToDate:
print("UP TO DATE FOR THE FIRST TIME")
print(NETWORK.get_status_value("updated"))
wasAlreadyUpToDate = True
writer = None
try:
reader, writer = await asyncio.wait_for(asyncio.open_connection(machine, 1080), 5)
writer.write(b"MAGIC")
writer.write(privateKeyHash[:6])
await asyncio.wait_for(writer.drain(), 5)
while is_running():
obj = await readJson(reader, is_running)
if not obj: continue
if "id" not in obj:
print("Invoice update?", obj)
for i in self.subscribers: i(obj)
continue
await asyncio.wait_for(readReqAndReply(obj, writer), 10)
except:
traceback.print_exc()
await asyncio.sleep(5)
continue
def subscribe(self, notifyFunction):
self.subscribers.append(functools.partial(notifyFunction, "LightningWorker"))
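# Hub wire protocol, as implemented by LightningWorker.run above: open a TCP connection
# to machine:1080, send b"MAGIC" followed by the first 6 bytes of privateKeyHash as a
# session identifier, then exchange newline-delimited JSON. Objects without an "id" are
# invoice/subscription updates fanned out to the subscribers; objects with an "id" are
# RPC requests answered by readReqAndReply().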
async def readJson(reader, is_running):
data = b""
while is_running():
newlines = data.count(b"\n"[0])
if newlines > 1: print("Too many newlines in Electrum/lightning.py!", data)
try:
return json.loads(data)
except ValueError:
if data != b"": print("parse failed, data has", data)
try:
data += await asyncio.wait_for(reader.read(2048), 1)
except asyncio.TimeoutError:
continue
async def readReqAndReply(obj, writer):
methods = [
# SecretKeyRing
DerivePrivKey,
DeriveNextKey,
DeriveKey,
ScalarMult
# Signer / BlockchainIO
,ConfirmedBalance
,NewAddress
,ListUnspentWitness
,WriteDb
,FetchInputInfo
,ComputeInputScript
,SignOutputRaw
,PublishTransaction
,LockOutpoint
,UnlockOutpoint
,ListTransactionDetails
,SendOutputs
,IsSynced
,SignMessage]
result = None
found = False
try:
for method in methods:
if method.__name__ == obj["method"]:
params = obj["params"][0]
print("calling method", obj["method"], "with", params)
if asyncio.iscoroutinefunction(method):
result = await method(params)
else:
result = method(params)
found = True
break
except BaseException as e:
traceback.print_exc()
print("exception while calling method", obj["method"])
writer.write(json.dumps({"id":obj["id"],"error": {"code": -32002, "message": traceback.format_exc()}}).encode("ascii") + b"\n")
await writer.drain()
else:
if not found:
# TODO assumes obj has id
writer.write(json.dumps({"id":obj["id"],"error": {"code": -32601, "message": "invalid method"}}).encode("ascii") + b"\n")
else:
print("result was", result)
if result is None:
result = "{}"
try:
assert isinstance(json.loads(result), dict)
except:
traceback.print_exc()
print("wrong method implementation")
writer.write(json.dumps({"id":obj["id"],"error": {"code": -32000, "message": "wrong return type in electrum-lightning-hub"}}).encode("ascii") + b"\n")
else:
writer.write(json.dumps({"id":obj["id"],"result": result}).encode("ascii") + b"\n")
await writer.drain()
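# Illustrative traffic for readReqAndReply (message shapes taken from the code above;
# the concrete values are made up):
#   request  : {"id": 7, "method": "DeriveNextKey", "params": ["{\"keyFamily\": 3}"]}
#   success  : {"id": 7, "result": "{ \"keyDescriptor\": { \"pubKey\": \"...\" } }"}
#   unknown  : {"id": 7, "error": {"code": -32601, "message": "invalid method"}}
#   exception: {"id": 7, "error": {"code": -32002, "message": "<traceback text>"}}
# Each handler receives params[0] (a JSON-encoded protobuf message) and returns a JSON
# string, which is why the "result" field is itself a JSON document inside a string.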
def privKeyForPubKey(pubKey):
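# Look up a private key for a bare 33-byte pubkey by trying, in order:
# 1. extra keys remembered in wallet storage under "lightning_extra_keys",
# 2. keys previously handed out by DeriveNextKey (family 9000, indices below globalIdx),
# 3. the wallet key belonging to the p2wpkh address of the pubkey.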
global globalIdx
priv_keys = WALLET.storage.get("lightning_extra_keys", [])
for i in priv_keys:
candidate = EC_KEY(i.to_bytes(32, "big"))
if pubkFromECKEY(candidate) == pubKey:
return candidate
attemptKeyIdx = globalIdx - 1
while attemptKeyIdx >= 0:
attemptPrivKey = fetchPrivKey(None, 9000, attemptKeyIdx)
attempt = pubkFromECKEY(attemptPrivKey)
if attempt == pubKey:
return attemptPrivKey
attemptKeyIdx -= 1
adr = bitcoin.pubkey_to_address('p2wpkh', binascii.hexlify(pubKey).decode("utf-8"))
pri, redeem_script = WALLET.export_private_key(adr, None)
if redeem_script:
print("ignoring redeem script", redeem_script)
typ, pri, compressed = bitcoin.deserialize_privkey(pri)
return EC_KEY(pri)
#assert False, "could not find private key for pubkey {} hex={}".format(pubKey, binascii.hexlify(pubKey).decode("ascii"))
def derivePrivKey(keyDesc):
keyDescFam = keyDesc.keyLocator.family
keyDescIdx = keyDesc.keyLocator.index
keyDescPubKey = keyDesc.pubKey
if len(keyDescPubKey) != 0:
return privKeyForPubKey(keyDescPubKey)
return fetchPrivKey(None, keyDescFam, keyDescIdx)
def DerivePrivKey(json):
req = rpc_pb2.DerivePrivKeyRequest()
json_format.Parse(json, req)
m = rpc_pb2.DerivePrivKeyResponse()
m.privKey = derivePrivKey(req.keyDescriptor).secret.to_bytes(32, "big")
msg = json_format.MessageToJson(m)
return msg
globalIdx = 0
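# globalIdx is an in-memory counter: DeriveNextKey hands out keys derived at
# [1017, 9000, globalIdx], and privKeyForPubKey() above retries exactly those indices
# when it later has to match a raw pubKey.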
def DeriveNextKey(json):
global globalIdx
req = rpc_pb2.DeriveNextKeyRequest()
json_format.Parse(json, req)
family = req.keyFamily
m = rpc_pb2.DeriveNextKeyResponse()
# lnd leaves these unset:
# source: https://github.com/lightningnetwork/lnd/pull/769/files#diff-c954f5135a8995b1a3dfa298101dd0efR160
#m.keyDescriptor.keyLocator.family =
#m.keyDescriptor.keyLocator.index =
m.keyDescriptor.pubKey = pubkFromECKEY(fetchPrivKey(None, 9000, globalIdx))
globalIdx += 1
msg = json_format.MessageToJson(m)
return msg
def DeriveKey(json):
req = rpc_pb2.DeriveKeyRequest()
json_format.Parse(json, req)
family = req.keyLocator.family
idx = req.keyLocator.index
m = rpc_pb2.DeriveKeyResponse()
#lnd sets these to parameter values
m.keyDescriptor.keyLocator.family = family
m.keyDescriptor.keyLocator.index = idx
m.keyDescriptor.pubKey = pubkFromECKEY(fetchPrivKey(None, family, idx))
msg = json_format.MessageToJson(m)
return msg
#// ScalarMult performs a scalar multiplication (ECDH-like operation) between
#// the target key descriptor and remote public key. The output returned will be
#// the sha256 of the resulting shared point serialized in compressed format. If
#// k is our private key, and P is the public key, we perform the following
#// operation:
#//
#//   sx := k*P
#//   s := sha256(sx.SerializeCompressed())
def ScalarMult(json):
req = rpc_pb2.ScalarMultRequest()
json_format.Parse(json, req)
privKey = derivePrivKey(req.keyDescriptor)
point = bitcoin.ser_to_point(req.pubKey)
point = point * privKey.secret
c = hashlib.sha256()
c.update(bitcoin.point_to_ser(point, True))
m = rpc_pb2.ScalarMultResponse()
m.hashResult = c.digest()
msg = json_format.MessageToJson(m)
return msg
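# Standalone sketch (not part of the original file) of the ECDH construction that
# ScalarMult() above implements, written against the pure-python ecdsa package rather
# than Electrum's bitcoin helpers. Both parties hash the compressed shared point and
# end up with the same 32-byte secret: sha256(k_a*P_b) == sha256(k_b*P_a).
import hashlib
from ecdsa import SECP256k1, SigningKey

def compress(point):
    # 33-byte SEC compressed encoding: 02/03 parity prefix + 32-byte big-endian x
    prefix = b"\x03" if point.y() & 1 else b"\x02"
    return prefix + point.x().to_bytes(32, "big")

def shared_secret(own_scalar, their_point):
    return hashlib.sha256(compress(their_point * own_scalar)).digest()

alice = SigningKey.generate(curve=SECP256k1)
bob = SigningKey.generate(curve=SECP256k1)
s1 = shared_secret(alice.privkey.secret_multiplier, bob.get_verifying_key().pubkey.point)
s2 = shared_secret(bob.privkey.secret_multiplier, alice.get_verifying_key().pubkey.point)
assert s1 == s2 and len(s1) == 32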
def pubkFromECKEY(eckey):
return bytes(bytearray.fromhex(eckey.get_public_key(True))) #compressed=True

File diff suppressed because it is too large

lib/servers_testnet.json

@@ -1,8 +1,6 @@
{
"testnetnode.arihanc.com": {"t":"51001", "s":"51002"},
"testnet1.bauerj.eu": {"t":"51001", "s":"51002"},
"14.3.140.101": {"t":"51001", "s":"51002"},
"testnet.hsmiths.com": {"t":"53011", "s":"53012"},
"electrum.akinbo.org": {"t":"51001", "s":"51002"},
"ELEX05.blackpole.online": {"t":"52011", "s":"52002"}
"electrum.akinbo.org": {"t":"51001", "s":"51002"}
}

lib/ssl_in_socks.py (new file, 85 lines)

@@ -0,0 +1,85 @@
import traceback
import ssl
from asyncio.sslproto import SSLProtocol
import aiosocks
import asyncio
from . import interface
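# Module overview: TLS inside a SOCKS tunnel. aiosocks.create_connection() opens the
# proxied TCP connection, asyncio's internal SSLProtocol performs the TLS handshake on
# top of it, and AppProto buffers decrypted bytes and pushes complete newline-terminated
# messages onto an asyncio.Queue. ReaderEmulator/WriterEmulator then expose a minimal
# StreamReader/StreamWriter-like interface over that queue and the SSL app transport.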
class AppProto(asyncio.Protocol):
def __init__(self, receivedQueue, connUpLock):
self.buf = bytearray()
self.receivedQueue = receivedQueue
self.connUpLock = connUpLock
def connection_made(self, transport):
self.connUpLock.release()
def data_received(self, data):
self.buf.extend(data)
NEWLINE = b"\n"[0]
for idx, val in enumerate(self.buf):
if NEWLINE == val:
asyncio.ensure_future(self.receivedQueue.put(bytes(self.buf[:idx+1])))
self.buf = self.buf[idx+1:]
def makeProtocolFactory(receivedQueue, connUpLock, ca_certs):
class MySSLProtocol(SSLProtocol):
def __init__(self):
context = interface.get_ssl_context(\
cert_reqs=ssl.CERT_REQUIRED if ca_certs is not None else ssl.CERT_NONE,\
ca_certs=ca_certs)
proto = AppProto(receivedQueue, connUpLock)
super().__init__(asyncio.get_event_loop(), proto, context, None)
return MySSLProtocol
class ReaderEmulator:
def __init__(self, receivedQueue):
self.receivedQueue = receivedQueue
async def read(self, _bufferSize):
return await self.receivedQueue.get()
class WriterEmulator:
def __init__(self, transport):
self.transport = transport
def write(self, data):
self.transport.write(data)
async def drain(self):
pass
def close(self):
self.transport.close()
async def sslInSocksReaderWriter(socksAddr, socksAuth, host, port, ca_certs):
receivedQueue = asyncio.Queue()
connUpLock = asyncio.Lock()
await connUpLock.acquire()
transport, protocol = await aiosocks.create_connection(\
makeProtocolFactory(receivedQueue, connUpLock, ca_certs),\
proxy=socksAddr,\
proxy_auth=socksAuth, dst=(host, port))
await connUpLock.acquire()
return ReaderEmulator(receivedQueue), WriterEmulator(protocol._app_transport)
if __name__ == "__main__":
async def l(fut):
try:
# aiosocks.Socks4Addr("127.0.0.1", 9050), None, "songbird.bauerj.eu", 50002, None)
args = aiosocks.Socks4Addr("127.0.0.1", 9050), None, "electrum.akinbo.org", 51002, None
reader, writer = await sslInSocksReaderWriter(*args)
writer.write(b'{"id":0,"method":"server.version","args":["3.0.2", "1.1"]}\n')
await writer.drain()
print(await reader.read(4096))
writer.write(b'{"id":0,"method":"server.version","args":["3.0.2", "1.1"]}\n')
await writer.drain()
print(await reader.read(4096))
writer.close()
fut.set_result("finished")
except BaseException as e:
fut.set_exception(e)
def f():
loop = asyncio.get_event_loop()
fut = asyncio.Future()
asyncio.ensure_future(l(fut))
loop.run_until_complete(fut)
print(fut.result())
loop.close()
f()

lib/synchronizer.py

@@ -27,10 +27,10 @@ import hashlib
# from .bitcoin import Hash, hash_encode
from .transaction import Transaction
from .util import ThreadJob, bh2u
from .util import CoroutineJob, bh2u
class Synchronizer(ThreadJob):
class Synchronizer(CoroutineJob):
'''The synchronizer keeps the wallet up-to-date with its set of
addresses and their transactions. It subscribes over the network
to wallet addresses, gets the wallet to generate new addresses
@@ -178,7 +178,7 @@ class Synchronizer(ThreadJob):
self.print_error("missing tx", self.requested_tx)
self.subscribe_to_addresses(set(self.wallet.get_addresses()))
def run(self):
async def run(self):
'''Called from the network proxy thread main loop.'''
# 1. Create new addresses
self.wallet.synchronize()

lib/transaction.py

@@ -526,7 +526,10 @@ def deserialize(raw):
txin['address'] = bitcoin.public_key_to_p2wpkh(bfh(txin['pubkeys'][0]))
else:
txin['type'] = 'p2wsh'
txin['address'] = bitcoin.script_to_p2wsh(txin['witnessScript'])
try:
txin['address'] = bitcoin.script_to_p2wsh(txin['witnessScript'])
except KeyError:
pass
d['lockTime'] = vds.read_uint32()
return d

lib/util.py

@@ -28,14 +28,16 @@ from decimal import Decimal
import traceback
import urllib
import threading
import time
import json
import urllib.request, urllib.parse, urllib.error
import queue
import asyncio
import hmac
from .i18n import _
import urllib.request, urllib.parse, urllib.error
import queue
def inv_dict(d):
return {v: k for k, v in d.items()}
@@ -86,6 +88,24 @@ class PrintError(object):
def print_msg(self, *msg):
print_msg("[%s]" % self.diagnostic_name(), *msg)
class ForeverCoroutineJob(PrintError):
"""A job that is run from a thread's main loop. run() is
called from that thread's context.
"""
async def run(self, is_running):
"""Called once from the thread"""
pass
class CoroutineJob(PrintError):
"""A job that is run periodically from a thread's main loop. run() is
called from that thread's context.
"""
async def run(self):
"""Called periodically from the thread"""
pass
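# Hypothetical sketch (not part of this diff): a periodic CoroutineJob subclass and how
# it would be registered; `network` is assumed to be the DaemonThread subclass whose
# event loop drives run_coroutines().
#
#   class HeartbeatJob(CoroutineJob):
#       async def run(self):
#           self.print_error("still alive")   # runs periodically on the network thread
#
#   network.add_coroutines([HeartbeatJob()])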
class ThreadJob(PrintError):
"""A job that is run periodically from a thread's main loop. run() is
called from that thread's context.
@@ -130,6 +150,38 @@ class DaemonThread(threading.Thread, PrintError):
self.running_lock = threading.Lock()
self.job_lock = threading.Lock()
self.jobs = []
self.coroutines = []
self.forever_coroutines_task = None
def add_coroutines(self, jobs):
for i in jobs: assert isinstance(i, CoroutineJob), i.__class__.__name__ + " does not inherit from CoroutineJob"
self.coroutines.extend(jobs)
def set_forever_coroutines(self, jobs):
for i in jobs: assert isinstance(i, ForeverCoroutineJob), i.__class__.__name__ + " does not inherit from ForeverCoroutineJob"
async def put():
await self.forever_coroutines_queue.put(jobs)
asyncio.run_coroutine_threadsafe(put(), self.loop)
def run_forever_coroutines(self):
self.forever_coroutines_queue = asyncio.Queue() # making queue here because __init__ is called from non-network thread
self.loop = asyncio.get_event_loop()
async def getFromQueueAndStart():
jobs = await self.forever_coroutines_queue.get()
await asyncio.gather(*[i.run(self.is_running) for i in jobs])
self.print_error("FOREVER JOBS DONE")
self.forever_coroutines_task = asyncio.ensure_future(getFromQueueAndStart())
return self.forever_coroutines_task
async def run_coroutines(self):
for coroutine in self.coroutines:
assert isinstance(coroutine, CoroutineJob)
await coroutine.run()
def remove_coroutines(self, jobs):
for i in jobs: assert isinstance(i, CoroutineJob)
for job in jobs:
self.coroutines.remove(job)
def add_jobs(self, jobs):
with self.job_lock:
@@ -141,6 +193,7 @@ class DaemonThread(threading.Thread, PrintError):
# malformed or malicious server responses
with self.job_lock:
for job in self.jobs:
assert isinstance(job, ThreadJob)
try:
job.run()
except Exception as e:
@@ -600,112 +653,8 @@ def parse_json(message):
class timeout(Exception):
pass
import socket
import json
import ssl
import time
class SocketPipe:
def __init__(self, socket):
self.socket = socket
self.message = b''
self.set_timeout(0.1)
self.recv_time = time.time()
def set_timeout(self, t):
self.socket.settimeout(t)
def idle_time(self):
return time.time() - self.recv_time
def get(self):
while True:
response, self.message = parse_json(self.message)
if response is not None:
return response
try:
data = self.socket.recv(1024)
except socket.timeout:
raise timeout
except ssl.SSLError:
raise timeout
except socket.error as err:
if err.errno == 60:
raise timeout
elif err.errno in [11, 35, 10035]:
print_error("socket errno %d (resource temporarily unavailable)"% err.errno)
time.sleep(0.2)
raise timeout
else:
print_error("pipe: socket error", err)
data = b''
except:
traceback.print_exc(file=sys.stderr)
data = b''
if not data: # Connection closed remotely
return None
self.message += data
self.recv_time = time.time()
def send(self, request):
out = json.dumps(request) + '\n'
out = out.encode('utf8')
self._send(out)
def send_all(self, requests):
out = b''.join(map(lambda x: (json.dumps(x) + '\n').encode('utf8'), requests))
self._send(out)
def _send(self, out):
while out:
try:
sent = self.socket.send(out)
out = out[sent:]
except ssl.SSLError as e:
print_error("SSLError:", e)
time.sleep(0.1)
continue
except OSError as e:
print_error("OSError", e)
time.sleep(0.1)
continue
class QueuePipe:
def __init__(self, send_queue=None, get_queue=None):
self.send_queue = send_queue if send_queue else queue.Queue()
self.get_queue = get_queue if get_queue else queue.Queue()
self.set_timeout(0.1)
def get(self):
try:
return self.get_queue.get(timeout=self.timeout)
except queue.Empty:
raise timeout
def get_all(self):
responses = []
while True:
try:
r = self.get_queue.get_nowait()
responses.append(r)
except queue.Empty:
break
return responses
def set_timeout(self, t):
self.timeout = t
def send(self, request):
self.send_queue.put(request)
def send_all(self, requests):
for request in requests:
self.send(request)
@@ -732,4 +681,4 @@ def setup_thread_excepthook():
self.run = run_with_except_hook
threading.Thread.__init__ = init
threading.Thread.__init__ = init

lib/verifier.py

@@ -20,11 +20,11 @@
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
from .util import ThreadJob
from .util import CoroutineJob
from .bitcoin import *
class SPV(ThreadJob):
class SPV(CoroutineJob):
""" Simple Payment Verification """
def __init__(self, network, wallet):
@@ -35,7 +35,7 @@ class SPV(ThreadJob):
# requested, and the merkle root once it has been verified
self.merkle_roots = {}
def run(self):
async def run(self):
lh = self.network.get_local_height()
unverified = self.wallet.get_unverified_txs()
for tx_hash, tx_height in unverified.items():

lib/wallet.py

@@ -56,6 +56,7 @@ from .plugins import run_hook
from . import bitcoin
from . import coinchooser
from .synchronizer import Synchronizer
from .lightning import LightningRPC
from .verifier import SPV
from . import paymentrequest
@@ -63,6 +64,8 @@ from .paymentrequest import PR_PAID, PR_UNPAID, PR_UNKNOWN, PR_EXPIRED
from .paymentrequest import InvoiceStore
from .contacts import Contacts
from .lightning import LightningWorker
TX_STATUS = [
_('Replaceable'),
_('Unconfirmed parent'),
@@ -1003,14 +1006,17 @@ class Abstract_Wallet(PrintError):
self.prepare_for_verifier()
self.verifier = SPV(self.network, self)
self.synchronizer = Synchronizer(self, network)
network.add_jobs([self.verifier, self.synchronizer])
network.add_coroutines([self.verifier, self.synchronizer])
self.lightning = LightningRPC()
self.lightningworker = LightningWorker(lambda: self, lambda: network, lambda: network.config)
network.set_forever_coroutines([self.lightning, self.lightningworker])
else:
self.verifier = None
self.synchronizer = None
def stop_threads(self):
if self.network:
self.network.remove_jobs([self.synchronizer, self.verifier])
self.network.remove_coroutines([self.synchronizer, self.verifier])
self.synchronizer.release()
self.synchronizer = None
self.verifier = None
@@ -1888,7 +1894,6 @@ class Multisig_Wallet(Deterministic_Wallet):
txin['signatures'] = [None] * self.n
txin['num_sig'] = self.m
wallet_types = ['standard', 'multisig', 'imported']
def register_wallet_type(category):

perf.data (new binary file, not shown)

protoc_lightning.sh (new executable file, 15 lines)

@@ -0,0 +1,15 @@
#!/bin/sh -ex
if [ ! -d $HOME/go/src/github.com/grpc-ecosystem ]; then
# from readme in https://github.com/grpc-ecosystem/grpc-gateway
go get -u github.com/grpc-ecosystem/grpc-gateway/protoc-gen-grpc-gateway
go get -u github.com/grpc-ecosystem/grpc-gateway/protoc-gen-swagger
go get -u github.com/golang/protobuf/protoc-gen-go
fi
if [ ! -d $HOME/go/src/github.com/lightningnetwork/lnd ]; then
echo "You need an lnd with electrum-bridge (ysangkok/lnd maybe?) checked out since we implement the interface from there, and need it to generate code"
exit 1
fi
mkdir -p lib/ln || true
touch lib/__init__.py
~/go/bin/protoc -I$HOME/include -I$HOME/go/src/github.com/grpc-ecosystem/grpc-gateway/third_party/googleapis --python_out=lib/ln $HOME/go/src/github.com/grpc-ecosystem/grpc-gateway/third_party/googleapis/google/api/*.proto
python3 -m grpc_tools.protoc -I $HOME/go/src/github.com/grpc-ecosystem/grpc-gateway/third_party/googleapis --proto_path $HOME/go/src/github.com/lightningnetwork/lnd/electrum-bridge --python_out=lib/ln --grpc_python_out=lib/ln ~/go/src/github.com/lightningnetwork/lnd/electrum-bridge/rpc.proto

setup.py

@@ -44,7 +44,7 @@ setup(
'protobuf',
'dnspython',
'jsonrpclib-pelix',
'PySocks>=1.6.6',
'aiosocks>=0.2.6',
],
packages=[
'electrum',

testserver.py (new file, 21 lines)

@@ -0,0 +1,21 @@
import asyncio
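# Fake hub for local testing: handler() accepts the LightningWorker connection on port
# 1080 (reads the b"MAGIC" + 6-byte session prefix, then pushes a canned invoice
# update), and handler2() answers the LightningRPC JSON-RPC-over-HTTP requests on port
# 8090 with a fixed {"result":"lol"} body. Run it with `machine` in lib/lightning.py
# pointing at 127.0.0.1.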
async def handler(reader, writer):
magic = await reader.read(5+6)
await asyncio.sleep(5)
print("in five sec!")
await asyncio.sleep(5)
writer.write(b'{\n "r_preimage": "6UNoNhDZ/0awtaDTM7KuCtlYcNkNljscxMLleoJv9+o=",\n "r_hash": "lQDtsJlLe8IzSRk0hrJcgglwRdtkHzX6mIwOhJrN7Ck=",\n "value": "8192",\n "settled": true,\n "creation_date": "1519994196",\n "settle_date": "1519994199",\n "payment_request": "lntb81920n1pdfj325pp5k7erq3avatceq8ca43h5uulxrhw2ma3a442a7c8fxrsw059c3m3sdqqcqzysdpwv4dn2xd74lfmea3taxj6pjfxrdl42t8w7ceptgv5ds0td0ypk47llryl6t4a48x54d7mnwremgcmljced4dhwty9g3pfywr307aqpwtkzf4",\n "expiry": "3600",\n "cltv_expiry": "144"\n}\n'.replace(b"\n",b""))
await writer.drain()
print(magic)
async def handler2(reader, writer):
while True:
data = await reader.read(2048)
if data != b'':
writer.write(b"HTTP/1.0 200 OK\r\nContent-length: 16\r\n\r\n{\"result\":\"lol\"}")
await writer.drain()
asyncio.ensure_future(asyncio.start_server(handler, "127.0.0.1", 1080))
asyncio.ensure_future(asyncio.start_server(handler2, "127.0.0.1", 8090))
asyncio.get_event_loop().run_forever()