diff --git a/CHANGELOG.md b/CHANGELOG.md index 2f1ec14fc..d64456bfd 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,6 +1,7 @@ -# 0.7.0-rc.18 (Synonym Fork) +# 0.7.0-rc.19 (Synonym Fork) ## Bug Fixes + - Backported upstream Electrum sync fix (PR #4341): Skip unconfirmed `get_history` entries in `ElectrumSyncClient`. Previously, mempool entries (height=0 or -1) were incorrectly treated as confirmed, causing `get_merkle` to fail for 0-conf channel funding transactions. @@ -8,6 +9,16 @@ emitted when LDK replays events after node restart. ## Synonym Fork Additions + +- Added multi-address type support for on-chain wallet: + - `AddressType` enum with Legacy, NestedSegwit, NativeSegwit, and Taproot variants + - `NodeBuilder::set_address_type()` to configure primary address type for new addresses + - `NodeBuilder::set_address_types_to_monitor()` to scan additional address types for existing funds + - `Node::list_monitored_address_types()` to get currently loaded wallet address types + - `Node::get_balance_for_address_type()` returns `AddressTypeBalance` with total and spendable sats + - `OnchainPayment::new_address_for_type()` to generate address for a specific type + - Per-wallet incremental sync with individual timestamps for optimal performance + - Cross-wallet UTXO aggregation for spending from multiple address types in one transaction - Upgraded to Kotlin 2.2.0 for compatibility with consuming apps using Kotlin 2.x - Added JitPack support for `ldk-node-jvm` module to enable unit testing in consuming apps - Added runtime-adjustable wallet sync intervals for battery optimization on mobile: @@ -88,11 +99,13 @@ would use internally. # 0.7.0 - Dec. 3, 2025 + This seventh minor release introduces numerous new features, bug fixes, and API improvements. In particular, it adds support for channel Splicing, Async Payments, as well as sourcing chain data from a Bitcoin Core REST backend. ## Feature and API updates + - Experimental support for channel splicing has been added. 
(#677) - - **Note**: Splicing-related transactions might currently still get misclassified in the payment store. + - **Note**: Splicing-related transactions might currently still get misclassified in the payment store. - Support for serving and paying static invoices for Async Payments has been added. (#621, #632) - Sourcing chain data via Bitcoin Core's REST interface is now supported. (#526) - A new `Builder::set_chain_source_esplora_with_headers` method has been added @@ -111,6 +124,7 @@ This seventh minor release introduces numerous new features, bug fixes, and API - The `generate_entropy_mnemonic` method now supports specifying a word count. (#699) ## Bug Fixes and Improvements + - Robustness of the shutdown procedure has been improved, minimizing risk of blocking during `Node::stop`. (#592, #612, #619, #622) - The VSS storage backend now supports 'lazy' deletes, allowing it to avoid unnecessarily waiting on remote calls for certain operations. (#689, #722) @@ -127,6 +141,7 @@ This seventh minor release introduces numerous new features, bug fixes, and API - The node now listens on all provided listening addresses. (#644) ## Compatibility Notes + - The minimum supported Rust version (MSRV) has been bumped to `rustc` v1.85 (#606) - The LDK dependency has been bumped to v0.2. - The BDK dependency has been bumped to v2.2. (#656) @@ -155,11 +170,13 @@ deletions in 264 commits from 14 authors in alphabetical order: - tosynthegeek # 0.6.2 - Aug. 14, 2025 + This patch release fixes a panic that could have been hit when syncing to a TLS-enabled Electrum server, as well as some minor issues when shutting down the node. ## Bug Fixes and Improvements + - If not set by the user, we now install a default `CryptoProvider` for the `rustls` TLS library. 
This fixes an issue that would have the node panic whenever it first tries to access an Electrum server behind an `ssl://` @@ -176,15 +193,18 @@ deletions in 13 commits from 2 authors in alphabetical order: - moisesPomilio # 0.6.1 - Jun. 19, 2025 + This patch release fixes minor issues with the recently-exposed `Bolt11Invoice` type in bindings. ## Feature and API updates + - The `Bolt11Invoice::description` method is now exposed as `Bolt11Invoice::invoice_description` in bindings, to avoid collisions with a Swift standard method of the same name (#576) ## Bug Fixes and Improvements + - The `Display` implementation of `Bolt11Invoice` is now exposed in bindings, (re-)allowing the invoice to be rendered as a string. (#574) @@ -194,16 +214,19 @@ in 8 commits from 1 author in alphabetical order: - Elias Rohrer # 0.6.0 - Jun. 9, 2025 + This sixth minor release mainly fixes an issue that could have left the on-chain wallet unable to spend funds if transactions that had previously been accepted to the mempool ended up being evicted. ## Feature and API updates + - Onchain addresses are now validated against the expected network before use (#519). - The API methods on the `Bolt11Invoice` type are now exposed in bindings (#522). - The `UnifiedQrPayment::receive` flow no longer aborts if we're unable to generate a BOLT12 offer (#548). ## Bug Fixes and Improvements + - Previously, the node could potentially enter a state that would have left the onchain wallet unable to spend any funds if previously-generated transactions had been first accepted, and then evicted from the mempool. This has been @@ -213,6 +236,7 @@ accepted to the mempool ended up being evicted. - The output of the `log` facade logger has been corrected (#547). ## Compatibility Notes + - The BDK dependency has been bumped to `bdk_wallet` v2.0 (#551).
In total, this release features 20 files changed, 1188 insertions, 447 deletions, in 18 commits from 3 authors in alphabetical order: @@ -222,9 +246,11 @@ In total, this release features 20 files changed, 1188 insertions, 447 deletions - Elias Rohrer # 0.5.0 - Apr. 29, 2025 + Besides numerous API improvements and bugfixes, this fifth minor release notably adds support for sourcing chain and fee rate data from an Electrum backend, requesting channels via the [bLIP-51 / LSPS1](https://github.com/lightning/blips/blob/master/blip-0051.md) protocol, as well as experimental support for operating as a [bLIP-52 / LSPS2](https://github.com/lightning/blips/blob/master/blip-0052.md) service. ## Feature and API updates + - The `PaymentSuccessful` event now exposes a `payment_preimage` field (#392). - The node now emits `PaymentForwarded` events for forwarded payments (#404). - The ability to send custom TLVs as part of spontaneous payments has been added (#411). @@ -238,7 +264,7 @@ Besides numerous API improvements and bugfixes this fifth minor release notably - On-chain transactions are now added to the internal payment store and exposed via `Node::list_payments` (#432). - Inbound announced channels are now rejected if not all requirements for operating as a forwarding node (set listening addresses and node alias) have been met (#467). - Initial support for operating as a bLIP-52 / LSPS2 service has been added (#420). - - **Note**: bLIP-52 / LSPS2 support is considered 'alpha'/'experimental' and should *not* yet be used in production. + - **Note**: bLIP-52 / LSPS2 support is considered 'alpha'/'experimental' and should _not_ yet be used in production. - The `Builder::set_entropy_seed_bytes` method now takes an array rather than a `Vec` (#493). - The builder will now return a `NetworkMismatch` error in case of network switching (#485). - The `Bolt11Jit` payment variant now exposes a field telling how much fee the LSP withheld (#497).
@@ -248,6 +274,7 @@ Besides numerous API improvements and bugfixes this fifth minor release notably - The ability to sync the node via an Electrum backend has been added (#486). ## Bug Fixes and Improvements + - When syncing from Bitcoin Core RPC, syncing mempool entries has been made more efficient (#410, #465). - We now ensure that our configured fallback rates are used when the configured chain source would return huge bogus values during fee estimation (#430). - We have now re-enabled trying to bump Anchor channel transactions for trusted counterparties in the `ContentiousClaimable` case to reduce the risk of losing funds in certain edge cases (#461). @@ -255,6 +282,7 @@ Besides numerous API improvements and bugfixes this fifth minor release notably - The `Node::remove_payment` now also removes the respective entry from the in-memory state, not only from the persisted payment store (#514). ## Compatibility Notes + - The filesystem logger was simplified and its default path changed to `ldk_node.log` in the configured storage directory (#394). - The BDK dependency has been bumped to `bdk_wallet` v1.0 (#426). - The LDK dependency has been bumped to `lightning` v0.1 (#426). @@ -295,7 +323,6 @@ In total, this release features 1 files changed, 40 insertions, 4 deletions in 3 - Fuyin - Elias Rohrer - # 0.4.1 - Oct 18, 2024 This patch release fixes a wallet syncing issue where full syncs were used instead of incremental syncs, and vice versa (#383). @@ -311,10 +338,11 @@ In total, this release features 3 files changed, 13 insertions, 9 deletions in 6 Besides numerous API improvements and bugfixes, this fourth minor release notably adds support for sourcing chain and fee rate data from a Bitcoin Core RPC backend, as well as experimental support for the [VSS] remote storage backend. ## Feature and API updates + - Support for multiple chain sources has been added.
To this end, Esplora-specific configuration options can now be given via `EsploraSyncConfig` to `Builder::set_chain_source_esplora`. Furthermore, all configuration objects (including the main `Config`) are now exposed via the `config` sub-module (#365). - Support for sourcing chain and fee estimation data from a Bitcoin Core RPC backend has been added (#370). - Initial experimental support for an encrypted [VSS] remote storage backend has been added (#369, #376, #378). - - **Caution**: VSS support is in **alpha** and is considered experimental. Using VSS (or any remote persistence) may cause LDK to panic if persistence failures are unrecoverable, i.e., if they remain unresolved after internal retries are exhausted. + - **Caution**: VSS support is in **alpha** and is considered experimental. Using VSS (or any remote persistence) may cause LDK to panic if persistence failures are unrecoverable, i.e., if they remain unresolved after internal retries are exhausted. - Support for setting the `NodeAlias` in public node announcements has been added. We now ensure that announced channels can only be opened and accepted when the required configuration options to operate as a public forwarding node are set (listening addresses and node alias). As part of this, `Node::connect_open_channel` was split into `open_channel` and `open_announced_channel` API methods. (#330, #366). - The `Node` can now be started via a new `Node::start_with_runtime` call that allows reusing an outer `tokio` runtime context, avoiding runtime stacking when run in `async` environments (#319). - Support for generating and paying unified QR codes has been added (#302). @@ -322,16 +350,19 @@ Besides numerous API improvements and bugfixes this fourth minor release notably - Support for setting additional parameters when sending BOLT11 payments has been added (#336, #351).
## Bug Fixes + - The `ChannelConfig` object has been refactored, now allowing querying the currently applied `MaxDustHTLCExposure` limit (#350). - A bug potentially leading to panicking on shutdown when stacking `tokio` runtime contexts has been fixed (#373). - We now no longer panic when hitting a persistence failure during event handling. Instead, events will be replayed until successful (#374). -, + , + ## Compatibility Notes + - The LDK dependency has been updated to version 0.0.125 (#358, #375). - The BDK dependency has been updated to version 1.0-beta.4 (#358). - - Going forward, the BDK state will be persisted in the configured `KVStore` backend. - - **Note**: The old descriptor state will *not* be automatically migrated on upgrade, potentially leading to address reuse. Privacy-concious users might want to manually advance the descriptor by requesting new addresses until it reaches the previously observed height. - - After the node as been successfully upgraded users may safely delete `bdk_wallet_*.sqlite` from the storage path. + - Going forward, the BDK state will be persisted in the configured `KVStore` backend. + - **Note**: The old descriptor state will _not_ be automatically migrated on upgrade, potentially leading to address reuse. Privacy-conscious users might want to manually advance the descriptor by requesting new addresses until it reaches the previously observed height. + - After the node has been successfully upgraded, users may safely delete `bdk_wallet_*.sqlite` from the storage path. - The `rust-bitcoin` dependency has been updated to version 0.32.2 (#358). - The UniFFI dependency has been updated to version 0.27.3 (#379). - The `bip21` dependency has been updated to version 0.5 (#358). @@ -354,6 +385,7 @@ This third minor release notably adds support for BOLT12 payments, Anchor channels, and sourcing inbound liquidity via LSPS2 just-in-time channels.
## Feature and API updates + - Support for creating and paying BOLT12 offers and refunds has been added (#265). - Support for Anchor channels has been added (#141). - Support for sourcing inbound liquidity via LSPS2 just-in-time (JIT) channels has been added (#223). @@ -371,6 +403,7 @@ channels, and sourcing inbound liquidity via LSPS2 just-in-time channels. - The ability to register and claim from custom payment hashes generated outside of LDK Node has been added (#308). ## Bug Fixes + - Node announcements are now correctly only broadcast if we have any public, sufficiently confirmed channels (#248, #314). - Falling back to default fee values is now disallowed on mainnet, ensuring we won't start up without a successful fee cache update (#249). - Persisted peers are now correctly reconnected after startup (#265). @@ -378,6 +411,7 @@ channels, and sourcing inbound liquidity via LSPS2 just-in-time channels. - Several steps have been taken to reduce the risk of blocking node operation on wallet syncing in the face of unresponsive Esplora services (#281).
@@ -432,10 +469,12 @@ deletions in 26 commits from 3 authors, in alphabetical order: - A module for persisting, sweeping, and rebroadcasting output spends has been added (#152). ## Bug Fixes + - No errors are logged anymore when we choose to omit spending of `StaticOutput`s (#137). - An inconsistent state of the log file symlink no longer results in an error during startup (#153). ## Compatibility Notes + - Our currently supported minimum Rust version (MSRV) is 1.63.0. - The Rust crate edition has been bumped to 2021. - Building on Windows is now supported (#160). @@ -454,6 +493,7 @@ In total, this release features 57 files changed, 7369 insertions, 1738 deletion - Orbital # 0.1.0 - Jun 22, 2023 + This is the first non-experimental release of LDK Node. - Log files are now split based on the start date of the node (#116). @@ -467,15 +507,18 @@ This is the first non-experimental release of LDK Node. - The API has been updated to be more aligned between Rust and bindings (#114). ## Compatibility Notes + - Our currently supported minimum Rust version (MSRV) is 1.60.0. - The superfluous `SendingFailed` payment status has been removed, breaking serialization compatibility with alpha releases (#125). - The serialization formats of `PaymentDetails` and `Event` types have been updated, ensuring users upgrading from an alpha release fail to start rather than continuing to operate with bogus data. Alpha users should wipe their persisted payment metadata (`payments/*`) and event queue (`events`) after the update (#130). In total, this release includes changes in 52 commits from 2 authors: + - Elias Rohrer - Richard Ulrich # 0.1-alpha.1 - Jun 6, 2023 + - Generation of Swift, Kotlin (JVM and Android), and Python bindings is now supported through UniFFI (#25). - Lists of connected peers and channels may now be retrieved in bindings (#56). - Gossip data may now be sourced from the P2P network, or a Rapid Gossip Sync server (#70).
@@ -490,8 +533,8 @@ In total, this release includes changes in 52 commits from 2 authors: - The wallet sync intervals are now configurable (#102). - Granularity of logging can now be configured (#108). - In total, this release includes changes in 64 commits from 4 authors: + - Steve Myers - Elias Rohrer - Jurvis Tan @@ -501,6 +544,7 @@ In total, this release includes changes in 64 commits from 4 authors: production, and no compatibility guarantees are given until the release of 0.1. # 0.1-alpha - Apr 27, 2023 + This is the first alpha release of LDK Node. It features support for sourcing chain data via an Esplora server, file system persistence, gossip sourcing via the Lightning peer-to-peer network, and configurable entropy sources for the diff --git a/Cargo.toml b/Cargo.toml index 76c8fda7a..fd6d8bad4 100755 --- a/Cargo.toml +++ b/Cargo.toml @@ -1,6 +1,6 @@ [package] name = "ldk-node" -version = "0.7.0-rc.18" +version = "0.7.0-rc.19" authors = ["Elias Rohrer "] homepage = "https://lightningdevkit.org/" license = "MIT OR Apache-2.0" diff --git a/Package.swift b/Package.swift index ab06e22ba..a05a9dfca 100644 --- a/Package.swift +++ b/Package.swift @@ -3,8 +3,8 @@ import PackageDescription -let tag = "v0.7.0-rc.18" -let checksum = "05903150276c3c31b2552b89d3781157ac1bbf55a10598655897abd9fe936b6c" +let tag = "v0.7.0-rc.19" +let checksum = "39cb32343f7a4f6c42f02d9e132971df36e7bc7618ad69fb390fdd8c3c223647" let url = "https://github.com/synonymdev/ldk-node/releases/download/\(tag)/LDKNodeFFI.xcframework.zip" let package = Package( diff --git a/bindings/kotlin/ldk-node-android/gradle.properties b/bindings/kotlin/ldk-node-android/gradle.properties index 2b0abdb57..d3840e296 100644 --- a/bindings/kotlin/ldk-node-android/gradle.properties +++ b/bindings/kotlin/ldk-node-android/gradle.properties @@ -2,4 +2,4 @@ org.gradle.jvmargs=-Xmx1536m android.useAndroidX=true android.enableJetifier=true kotlin.code.style=official -libraryVersion=0.7.0-rc.18 +libraryVersion=0.7.0-rc.19 
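The multi-address additions in the fork changelog above (the `AddressType` enum, a per-type `AddressTypeBalance` carrying total and spendable sats, and cross-wallet UTXO aggregation) can be sketched in plain Rust. This is a minimal self-contained model of the shapes this diff adds to the bindings; the `aggregate` helper and the sample figures are hypothetical illustrations, not part of the ldk-node API:

```rust
// Mirrors the `AddressType` variants and `AddressTypeBalance` fields added
// in this diff. The `aggregate` helper below is a hypothetical sketch of
// the "cross-wallet UTXO aggregation" idea: fold per-type balances into a
// single view, as a spend drawing on several address types would see them.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum AddressType {
    Legacy,
    NestedSegwit,
    NativeSegwit,
    Taproot,
}

#[derive(Debug, Default, Clone, Copy, PartialEq, Eq)]
pub struct AddressTypeBalance {
    pub total_sats: u64,
    pub spendable_sats: u64,
}

/// Sum per-address-type balances into one combined balance.
pub fn aggregate<'a, I: IntoIterator<Item = &'a AddressTypeBalance>>(balances: I) -> AddressTypeBalance {
    balances.into_iter().fold(AddressTypeBalance::default(), |acc, b| AddressTypeBalance {
        total_sats: acc.total_sats + b.total_sats,
        spendable_sats: acc.spendable_sats + b.spendable_sats,
    })
}

fn main() {
    // Sample figures only: one native-segwit wallet and one taproot wallet.
    let per_type = [
        (AddressType::NativeSegwit, AddressTypeBalance { total_sats: 150_000, spendable_sats: 120_000 }),
        (AddressType::Taproot, AddressTypeBalance { total_sats: 50_000, spendable_sats: 50_000 }),
    ];
    let combined = aggregate(per_type.iter().map(|(_, b)| b));
    println!("{:?}", combined);
}
```

In the real API these per-type figures would come from `Node::get_balance_for_address_type` for each type returned by `Node::list_monitored_address_types`; the fold simply makes explicit what "spending from multiple address types in one transaction" implies for balance reporting.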
diff --git a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/arm64-v8a/libldk_node.so b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/arm64-v8a/libldk_node.so index 1798f33a0..acd819fe8 100755 Binary files a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/arm64-v8a/libldk_node.so and b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/arm64-v8a/libldk_node.so differ diff --git a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/armeabi-v7a/libldk_node.so b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/armeabi-v7a/libldk_node.so index 45afa3297..298f228f3 100755 Binary files a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/armeabi-v7a/libldk_node.so and b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/armeabi-v7a/libldk_node.so differ diff --git a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/x86_64/libldk_node.so b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/x86_64/libldk_node.so index b75d024e8..2651f747d 100755 Binary files a/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/x86_64/libldk_node.so and b/bindings/kotlin/ldk-node-android/lib/src/main/jniLibs/x86_64/libldk_node.so differ diff --git a/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.android.kt b/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.android.kt index 9aca7ae51..d97998ebf 100644 --- a/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.android.kt +++ b/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.android.kt @@ -1487,6 +1487,16 @@ internal typealias UniffiVTableCallbackInterfaceVssHeaderProviderUniffiByValue = + + + + + + + + + + @@ -1949,6 +1959,16 @@ internal interface UniffiLib : Library { `headerProvider`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): Pointer? 
+ fun uniffi_ldk_node_fn_method_builder_set_address_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): Unit + fun uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor( + `ptr`: Pointer?, + `addressTypesToMonitor`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): Unit fun uniffi_ldk_node_fn_method_builder_set_announcement_addresses( `ptr`: Pointer?, `announcementAddresses`: RustBufferByValue, @@ -2230,6 +2250,11 @@ internal interface UniffiLib : Library { `addressStr`: RustBufferByValue, uniffiCallStatus: UniffiRustCallStatus, ): Long + fun uniffi_ldk_node_fn_method_node_get_balance_for_address_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): RustBufferByValue fun uniffi_ldk_node_fn_method_node_get_transaction_details( `ptr`: Pointer?, `txid`: RustBufferByValue, @@ -2243,6 +2268,10 @@ internal interface UniffiLib : Library { `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): RustBufferByValue + fun uniffi_ldk_node_fn_method_node_list_monitored_address_types( + `ptr`: Pointer?, + uniffiCallStatus: UniffiRustCallStatus, + ): RustBufferByValue fun uniffi_ldk_node_fn_method_node_list_payments( `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, @@ -2500,6 +2529,11 @@ internal interface UniffiLib : Library { `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): RustBufferByValue + fun uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): RustBufferByValue fun uniffi_ldk_node_fn_method_onchainpayment_select_utxos_with_algorithm( `ptr`: Pointer?, `targetAmountSats`: Long, @@ -3039,6 +3073,10 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider( ): Short + fun uniffi_ldk_node_checksum_method_builder_set_address_type( + ): 
Short + fun uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor( + ): Short fun uniffi_ldk_node_checksum_method_builder_set_announcement_addresses( ): Short fun uniffi_ldk_node_checksum_method_builder_set_async_payments_role( @@ -3127,12 +3165,16 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_node_get_address_balance( ): Short + fun uniffi_ldk_node_checksum_method_node_get_balance_for_address_type( + ): Short fun uniffi_ldk_node_checksum_method_node_get_transaction_details( ): Short fun uniffi_ldk_node_checksum_method_node_list_balances( ): Short fun uniffi_ldk_node_checksum_method_node_list_channels( ): Short + fun uniffi_ldk_node_checksum_method_node_list_monitored_address_types( + ): Short fun uniffi_ldk_node_checksum_method_node_list_payments( ): Short fun uniffi_ldk_node_checksum_method_node_list_peers( @@ -3223,6 +3265,8 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_new_address( ): Short + fun uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type( + ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm( ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_send_all_to_address( @@ -3508,6 +3552,12 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider() != 9090.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_builder_set_address_type() != 647.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } + if (lib.uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor() != 23561.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if 
(lib.uniffi_ldk_node_checksum_method_builder_set_announcement_addresses() != 39271.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3640,6 +3690,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_node_get_address_balance() != 45284.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_node_get_balance_for_address_type() != 34906.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_node_get_transaction_details() != 65000.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3649,6 +3702,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_node_list_channels() != 7954.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_node_list_monitored_address_types() != 25084.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_node_list_payments() != 35002.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3784,6 +3840,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address() != 37251.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type() != 9083.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if 
(lib.uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm() != 14084.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -5641,6 +5700,30 @@ open class Builder: Disposable, BuilderInterface { }) } + override fun `setAddressType`(`addressType`: AddressType) { + callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_builder_set_address_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + } + } + + override fun `setAddressTypesToMonitor`(`addressTypesToMonitor`: List) { + callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor( + it, + FfiConverterSequenceTypeAddressType.lower(`addressTypesToMonitor`), + uniffiRustCallStatus, + ) + } + } + } + @Throws(BuildException::class) override fun `setAnnouncementAddresses`(`announcementAddresses`: List) { callWithPointer { @@ -6953,6 +7036,19 @@ open class Node: Disposable, NodeInterface { }) } + @Throws(NodeException::class) + override fun `getBalanceForAddressType`(`addressType`: AddressType): AddressTypeBalance { + return FfiConverterTypeAddressTypeBalance.lift(callWithPointer { + uniffiRustCallWithError(NodeExceptionErrorHandler) { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_node_get_balance_for_address_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + }) + } + override fun `getTransactionDetails`(`txid`: Txid): TransactionDetails? 
{ return FfiConverterOptionalTypeTransactionDetails.lift(callWithPointer { uniffiRustCall { uniffiRustCallStatus -> @@ -6987,6 +7083,17 @@ open class Node: Disposable, NodeInterface { }) } + override fun `listMonitoredAddressTypes`(): List { + return FfiConverterSequenceTypeAddressType.lift(callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_node_list_monitored_address_types( + it, + uniffiRustCallStatus, + ) + } + }) + } + override fun `listPayments`(): List { return FfiConverterSequenceTypePaymentDetails.lift(callWithPointer { uniffiRustCall { uniffiRustCallStatus -> @@ -7846,6 +7953,19 @@ open class OnchainPayment: Disposable, OnchainPaymentInterface { }) } + @Throws(NodeException::class) + override fun `newAddressForType`(`addressType`: AddressType): Address { + return FfiConverterTypeAddress.lift(callWithPointer { + uniffiRustCallWithError(NodeExceptionErrorHandler) { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + }) + } + @Throws(NodeException::class) override fun `selectUtxosWithAlgorithm`(`targetAmountSats`: kotlin.ULong, `feeRate`: FeeRate?, `algorithm`: CoinSelectionAlgorithm, `utxos`: List?): List { return FfiConverterSequenceTypeSpendableUtxo.lift(callWithPointer { @@ -8737,6 +8857,28 @@ object FfiConverterTypeVssHeaderProvider: FfiConverter { + override fun read(buf: ByteBuffer): AddressTypeBalance { + return AddressTypeBalance( + FfiConverterULong.read(buf), + FfiConverterULong.read(buf), + ) + } + + override fun allocationSize(value: AddressTypeBalance) = ( + FfiConverterULong.allocationSize(value.`totalSats`) + + FfiConverterULong.allocationSize(value.`spendableSats`) + ) + + override fun write(value: AddressTypeBalance, buf: ByteBuffer) { + FfiConverterULong.write(value.`totalSats`, buf) + FfiConverterULong.write(value.`spendableSats`, buf) + 
} +} + + + + object FfiConverterTypeAnchorChannelsConfig: FfiConverterRustBuffer { override fun read(buf: ByteBuffer): AnchorChannelsConfig { return AnchorChannelsConfig( @@ -9086,6 +9228,8 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterOptionalTypeAnchorChannelsConfig.read(buf), FfiConverterOptionalTypeRouteParametersConfig.read(buf), FfiConverterBoolean.read(buf), + FfiConverterTypeAddressType.read(buf), + FfiConverterSequenceTypeAddressType.read(buf), ) } @@ -9099,7 +9243,9 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterULong.allocationSize(value.`probingLiquidityLimitMultiplier`) + FfiConverterOptionalTypeAnchorChannelsConfig.allocationSize(value.`anchorChannelsConfig`) + FfiConverterOptionalTypeRouteParametersConfig.allocationSize(value.`routeParameters`) + - FfiConverterBoolean.allocationSize(value.`includeUntrustedPendingInSpendable`) + FfiConverterBoolean.allocationSize(value.`includeUntrustedPendingInSpendable`) + + FfiConverterTypeAddressType.allocationSize(value.`addressType`) + + FfiConverterSequenceTypeAddressType.allocationSize(value.`addressTypesToMonitor`) ) override fun write(value: Config, buf: ByteBuffer) { @@ -9113,6 +9259,8 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterOptionalTypeAnchorChannelsConfig.write(value.`anchorChannelsConfig`, buf) FfiConverterOptionalTypeRouteParametersConfig.write(value.`routeParameters`, buf) FfiConverterBoolean.write(value.`includeUntrustedPendingInSpendable`, buf) + FfiConverterTypeAddressType.write(value.`addressType`, buf) + FfiConverterSequenceTypeAddressType.write(value.`addressTypesToMonitor`, buf) } } @@ -9854,6 +10002,24 @@ object FfiConverterTypeTxOutput: FfiConverterRustBuffer { +object FfiConverterTypeAddressType: FfiConverterRustBuffer { + override fun read(buf: ByteBuffer) = try { + AddressType.entries[buf.getInt() - 1] + } catch (e: IndexOutOfBoundsException) { + throw RuntimeException("invalid enum value, something is very 
wrong!!", e) + } + + override fun allocationSize(value: AddressType) = 4UL + + override fun write(value: AddressType, buf: ByteBuffer) { + buf.putInt(value.ordinal + 1) + } +} + + + + + object FfiConverterTypeAsyncPaymentsRole: FfiConverterRustBuffer { override fun read(buf: ByteBuffer) = try { AsyncPaymentsRole.entries[buf.getInt() - 1] @@ -13487,6 +13653,31 @@ object FfiConverterSequenceTypeTxOutput: FfiConverterRustBuffer> +object FfiConverterSequenceTypeAddressType: FfiConverterRustBuffer> { + override fun read(buf: ByteBuffer): List { + val len = buf.getInt() + return List(len) { + FfiConverterTypeAddressType.read(buf) + } + } + + override fun allocationSize(value: List): ULong { + val sizeForLength = 4UL + val sizeForItems = value.sumOf { FfiConverterTypeAddressType.allocationSize(it) } + return sizeForLength + sizeForItems + } + + override fun write(value: List, buf: ByteBuffer) { + buf.putInt(value.size) + value.iterator().forEach { + FfiConverterTypeAddressType.write(it, buf) + } + } +} + + + + object FfiConverterSequenceTypeLightningBalance: FfiConverterRustBuffer> { override fun read(buf: ByteBuffer): List { val len = buf.getInt() diff --git a/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt b/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt index 572ca6180..66177cbd0 100644 --- a/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt +++ b/bindings/kotlin/ldk-node-android/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt @@ -299,6 +299,10 @@ interface BuilderInterface { @Throws(BuildException::class) fun `buildWithVssStoreAndHeaderProvider`(`vssUrl`: kotlin.String, `storeId`: kotlin.String, `headerProvider`: VssHeaderProvider): Node + fun `setAddressType`(`addressType`: AddressType) + + fun `setAddressTypesToMonitor`(`addressTypesToMonitor`: List) + @Throws(BuildException::class) fun 
`setAnnouncementAddresses`(`announcementAddresses`: List) @@ -441,12 +445,17 @@ interface NodeInterface { @Throws(NodeException::class) fun `getAddressBalance`(`addressStr`: kotlin.String): kotlin.ULong + @Throws(NodeException::class) + fun `getBalanceForAddressType`(`addressType`: AddressType): AddressTypeBalance + fun `getTransactionDetails`(`txid`: Txid): TransactionDetails? fun `listBalances`(): BalanceDetails fun `listChannels`(): List + fun `listMonitoredAddressTypes`(): List + fun `listPayments`(): List fun `listPeers`(): List @@ -569,6 +578,9 @@ interface OnchainPaymentInterface { @Throws(NodeException::class) fun `newAddress`(): Address + @Throws(NodeException::class) + fun `newAddressForType`(`addressType`: AddressType): Address + @Throws(NodeException::class) fun `selectUtxosWithAlgorithm`(`targetAmountSats`: kotlin.ULong, `feeRate`: FeeRate?, `algorithm`: CoinSelectionAlgorithm, `utxos`: List?): List @@ -660,6 +672,16 @@ interface VssHeaderProviderInterface { +@kotlinx.serialization.Serializable +data class AddressTypeBalance ( + val `totalSats`: kotlin.ULong, + val `spendableSats`: kotlin.ULong +) { + companion object +} + + + @kotlinx.serialization.Serializable data class AnchorChannelsConfig ( val `trustedPeersNoReserve`: List, @@ -807,7 +829,9 @@ data class Config ( val `probingLiquidityLimitMultiplier`: kotlin.ULong, val `anchorChannelsConfig`: AnchorChannelsConfig?, val `routeParameters`: RouteParametersConfig?, - val `includeUntrustedPendingInSpendable`: kotlin.Boolean + val `includeUntrustedPendingInSpendable`: kotlin.Boolean, + val `addressType`: AddressType, + val `addressTypesToMonitor`: List ) { companion object } @@ -1165,6 +1189,22 @@ data class TxOutput ( +@kotlinx.serialization.Serializable +enum class AddressType { + + LEGACY, + NESTED_SEGWIT, + NATIVE_SEGWIT, + TAPROOT; + companion object +} + + + + + + + @kotlinx.serialization.Serializable enum class AsyncPaymentsRole { @@ -2174,6 +2214,8 @@ enum class WordCount { + + diff --git 
a/bindings/kotlin/ldk-node-jvm/gradle.properties b/bindings/kotlin/ldk-node-jvm/gradle.properties index f2e17b9f8..a81a869d3 100644 --- a/bindings/kotlin/ldk-node-jvm/gradle.properties +++ b/bindings/kotlin/ldk-node-jvm/gradle.properties @@ -1,3 +1,3 @@ org.gradle.jvmargs=-Xmx1536m kotlin.code.style=official -libraryVersion=0.7.0-rc.18 +libraryVersion=0.7.0-rc.19 diff --git a/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt b/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt index 572ca6180..66177cbd0 100644 --- a/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt +++ b/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.common.kt @@ -299,6 +299,10 @@ interface BuilderInterface { @Throws(BuildException::class) fun `buildWithVssStoreAndHeaderProvider`(`vssUrl`: kotlin.String, `storeId`: kotlin.String, `headerProvider`: VssHeaderProvider): Node + fun `setAddressType`(`addressType`: AddressType) + + fun `setAddressTypesToMonitor`(`addressTypesToMonitor`: List) + @Throws(BuildException::class) fun `setAnnouncementAddresses`(`announcementAddresses`: List) @@ -441,12 +445,17 @@ interface NodeInterface { @Throws(NodeException::class) fun `getAddressBalance`(`addressStr`: kotlin.String): kotlin.ULong + @Throws(NodeException::class) + fun `getBalanceForAddressType`(`addressType`: AddressType): AddressTypeBalance + fun `getTransactionDetails`(`txid`: Txid): TransactionDetails? 
fun `listBalances`(): BalanceDetails fun `listChannels`(): List + fun `listMonitoredAddressTypes`(): List + fun `listPayments`(): List fun `listPeers`(): List @@ -569,6 +578,9 @@ interface OnchainPaymentInterface { @Throws(NodeException::class) fun `newAddress`(): Address + @Throws(NodeException::class) + fun `newAddressForType`(`addressType`: AddressType): Address + @Throws(NodeException::class) fun `selectUtxosWithAlgorithm`(`targetAmountSats`: kotlin.ULong, `feeRate`: FeeRate?, `algorithm`: CoinSelectionAlgorithm, `utxos`: List?): List @@ -660,6 +672,16 @@ interface VssHeaderProviderInterface { +@kotlinx.serialization.Serializable +data class AddressTypeBalance ( + val `totalSats`: kotlin.ULong, + val `spendableSats`: kotlin.ULong +) { + companion object +} + + + @kotlinx.serialization.Serializable data class AnchorChannelsConfig ( val `trustedPeersNoReserve`: List, @@ -807,7 +829,9 @@ data class Config ( val `probingLiquidityLimitMultiplier`: kotlin.ULong, val `anchorChannelsConfig`: AnchorChannelsConfig?, val `routeParameters`: RouteParametersConfig?, - val `includeUntrustedPendingInSpendable`: kotlin.Boolean + val `includeUntrustedPendingInSpendable`: kotlin.Boolean, + val `addressType`: AddressType, + val `addressTypesToMonitor`: List ) { companion object } @@ -1165,6 +1189,22 @@ data class TxOutput ( +@kotlinx.serialization.Serializable +enum class AddressType { + + LEGACY, + NESTED_SEGWIT, + NATIVE_SEGWIT, + TAPROOT; + companion object +} + + + + + + + @kotlinx.serialization.Serializable enum class AsyncPaymentsRole { @@ -2174,6 +2214,8 @@ enum class WordCount { + + diff --git a/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.jvm.kt b/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.jvm.kt index 050492d2f..3a2430505 100644 --- a/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.jvm.kt +++ 
b/bindings/kotlin/ldk-node-jvm/lib/src/main/kotlin/org/lightningdevkit/ldknode/ldk_node.jvm.kt @@ -1485,6 +1485,16 @@ internal typealias UniffiVTableCallbackInterfaceVssHeaderProviderUniffiByValue = + + + + + + + + + + @@ -1947,6 +1957,16 @@ internal interface UniffiLib : Library { `headerProvider`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): Pointer? + fun uniffi_ldk_node_fn_method_builder_set_address_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): Unit + fun uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor( + `ptr`: Pointer?, + `addressTypesToMonitor`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): Unit fun uniffi_ldk_node_fn_method_builder_set_announcement_addresses( `ptr`: Pointer?, `announcementAddresses`: RustBufferByValue, @@ -2228,6 +2248,11 @@ internal interface UniffiLib : Library { `addressStr`: RustBufferByValue, uniffiCallStatus: UniffiRustCallStatus, ): Long + fun uniffi_ldk_node_fn_method_node_get_balance_for_address_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): RustBufferByValue fun uniffi_ldk_node_fn_method_node_get_transaction_details( `ptr`: Pointer?, `txid`: RustBufferByValue, @@ -2241,6 +2266,10 @@ internal interface UniffiLib : Library { `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): RustBufferByValue + fun uniffi_ldk_node_fn_method_node_list_monitored_address_types( + `ptr`: Pointer?, + uniffiCallStatus: UniffiRustCallStatus, + ): RustBufferByValue fun uniffi_ldk_node_fn_method_node_list_payments( `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, @@ -2498,6 +2527,11 @@ internal interface UniffiLib : Library { `ptr`: Pointer?, uniffiCallStatus: UniffiRustCallStatus, ): RustBufferByValue + fun uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type( + `ptr`: Pointer?, + `addressType`: RustBufferByValue, + uniffiCallStatus: UniffiRustCallStatus, + ): 
RustBufferByValue fun uniffi_ldk_node_fn_method_onchainpayment_select_utxos_with_algorithm( `ptr`: Pointer?, `targetAmountSats`: Long, @@ -3037,6 +3071,10 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider( ): Short + fun uniffi_ldk_node_checksum_method_builder_set_address_type( + ): Short + fun uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor( + ): Short fun uniffi_ldk_node_checksum_method_builder_set_announcement_addresses( ): Short fun uniffi_ldk_node_checksum_method_builder_set_async_payments_role( @@ -3125,12 +3163,16 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_node_get_address_balance( ): Short + fun uniffi_ldk_node_checksum_method_node_get_balance_for_address_type( + ): Short fun uniffi_ldk_node_checksum_method_node_get_transaction_details( ): Short fun uniffi_ldk_node_checksum_method_node_list_balances( ): Short fun uniffi_ldk_node_checksum_method_node_list_channels( ): Short + fun uniffi_ldk_node_checksum_method_node_list_monitored_address_types( + ): Short fun uniffi_ldk_node_checksum_method_node_list_payments( ): Short fun uniffi_ldk_node_checksum_method_node_list_peers( @@ -3221,6 +3263,8 @@ internal interface UniffiLib : Library { ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_new_address( ): Short + fun uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type( + ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm( ): Short fun uniffi_ldk_node_checksum_method_onchainpayment_send_all_to_address( @@ -3506,6 +3550,12 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider() != 9090.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_builder_set_address_type() 
!= 647.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } + if (lib.uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor() != 23561.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_builder_set_announcement_addresses() != 39271.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3638,6 +3688,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_node_get_address_balance() != 45284.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_node_get_balance_for_address_type() != 34906.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_node_get_transaction_details() != 65000.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3647,6 +3700,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_node_list_channels() != 7954.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_node_list_monitored_address_types() != 25084.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_node_list_payments() != 35002.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -3782,6 +3838,9 @@ private fun uniffiCheckApiChecksums(lib: UniffiLib) { if (lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address() != 37251.toShort()) { 
throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } + if (lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type() != 9083.toShort()) { + throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + } if (lib.uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm() != 14084.toShort()) { throw RuntimeException("UniFFI API checksum mismatch: try cleaning and rebuilding your project") } @@ -5630,6 +5689,30 @@ open class Builder: Disposable, BuilderInterface { }) } + override fun `setAddressType`(`addressType`: AddressType) { + callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_builder_set_address_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + } + } + + override fun `setAddressTypesToMonitor`(`addressTypesToMonitor`: List) { + callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor( + it, + FfiConverterSequenceTypeAddressType.lower(`addressTypesToMonitor`), + uniffiRustCallStatus, + ) + } + } + } + @Throws(BuildException::class) override fun `setAnnouncementAddresses`(`announcementAddresses`: List) { callWithPointer { @@ -6942,6 +7025,19 @@ open class Node: Disposable, NodeInterface { }) } + @Throws(NodeException::class) + override fun `getBalanceForAddressType`(`addressType`: AddressType): AddressTypeBalance { + return FfiConverterTypeAddressTypeBalance.lift(callWithPointer { + uniffiRustCallWithError(NodeExceptionErrorHandler) { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_node_get_balance_for_address_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + }) + } + override fun `getTransactionDetails`(`txid`: Txid): TransactionDetails? 
{ return FfiConverterOptionalTypeTransactionDetails.lift(callWithPointer { uniffiRustCall { uniffiRustCallStatus -> @@ -6976,6 +7072,17 @@ open class Node: Disposable, NodeInterface { }) } + override fun `listMonitoredAddressTypes`(): List { + return FfiConverterSequenceTypeAddressType.lift(callWithPointer { + uniffiRustCall { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_node_list_monitored_address_types( + it, + uniffiRustCallStatus, + ) + } + }) + } + override fun `listPayments`(): List { return FfiConverterSequenceTypePaymentDetails.lift(callWithPointer { uniffiRustCall { uniffiRustCallStatus -> @@ -7835,6 +7942,19 @@ open class OnchainPayment: Disposable, OnchainPaymentInterface { }) } + @Throws(NodeException::class) + override fun `newAddressForType`(`addressType`: AddressType): Address { + return FfiConverterTypeAddress.lift(callWithPointer { + uniffiRustCallWithError(NodeExceptionErrorHandler) { uniffiRustCallStatus -> + UniffiLib.INSTANCE.uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type( + it, + FfiConverterTypeAddressType.lower(`addressType`), + uniffiRustCallStatus, + ) + } + }) + } + @Throws(NodeException::class) override fun `selectUtxosWithAlgorithm`(`targetAmountSats`: kotlin.ULong, `feeRate`: FeeRate?, `algorithm`: CoinSelectionAlgorithm, `utxos`: List?): List { return FfiConverterSequenceTypeSpendableUtxo.lift(callWithPointer { @@ -8726,6 +8846,28 @@ object FfiConverterTypeVssHeaderProvider: FfiConverter { + override fun read(buf: ByteBuffer): AddressTypeBalance { + return AddressTypeBalance( + FfiConverterULong.read(buf), + FfiConverterULong.read(buf), + ) + } + + override fun allocationSize(value: AddressTypeBalance) = ( + FfiConverterULong.allocationSize(value.`totalSats`) + + FfiConverterULong.allocationSize(value.`spendableSats`) + ) + + override fun write(value: AddressTypeBalance, buf: ByteBuffer) { + FfiConverterULong.write(value.`totalSats`, buf) + FfiConverterULong.write(value.`spendableSats`, buf) + 
} +} + + + + object FfiConverterTypeAnchorChannelsConfig: FfiConverterRustBuffer { override fun read(buf: ByteBuffer): AnchorChannelsConfig { return AnchorChannelsConfig( @@ -9075,6 +9217,8 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterOptionalTypeAnchorChannelsConfig.read(buf), FfiConverterOptionalTypeRouteParametersConfig.read(buf), FfiConverterBoolean.read(buf), + FfiConverterTypeAddressType.read(buf), + FfiConverterSequenceTypeAddressType.read(buf), ) } @@ -9088,7 +9232,9 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterULong.allocationSize(value.`probingLiquidityLimitMultiplier`) + FfiConverterOptionalTypeAnchorChannelsConfig.allocationSize(value.`anchorChannelsConfig`) + FfiConverterOptionalTypeRouteParametersConfig.allocationSize(value.`routeParameters`) + - FfiConverterBoolean.allocationSize(value.`includeUntrustedPendingInSpendable`) + FfiConverterBoolean.allocationSize(value.`includeUntrustedPendingInSpendable`) + + FfiConverterTypeAddressType.allocationSize(value.`addressType`) + + FfiConverterSequenceTypeAddressType.allocationSize(value.`addressTypesToMonitor`) ) override fun write(value: Config, buf: ByteBuffer) { @@ -9102,6 +9248,8 @@ object FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterOptionalTypeAnchorChannelsConfig.write(value.`anchorChannelsConfig`, buf) FfiConverterOptionalTypeRouteParametersConfig.write(value.`routeParameters`, buf) FfiConverterBoolean.write(value.`includeUntrustedPendingInSpendable`, buf) + FfiConverterTypeAddressType.write(value.`addressType`, buf) + FfiConverterSequenceTypeAddressType.write(value.`addressTypesToMonitor`, buf) } } @@ -9843,6 +9991,24 @@ object FfiConverterTypeTxOutput: FfiConverterRustBuffer { +object FfiConverterTypeAddressType: FfiConverterRustBuffer { + override fun read(buf: ByteBuffer) = try { + AddressType.entries[buf.getInt() - 1] + } catch (e: IndexOutOfBoundsException) { + throw RuntimeException("invalid enum value, something is very 
wrong!!", e) + } + + override fun allocationSize(value: AddressType) = 4UL + + override fun write(value: AddressType, buf: ByteBuffer) { + buf.putInt(value.ordinal + 1) + } +} + + + + + object FfiConverterTypeAsyncPaymentsRole: FfiConverterRustBuffer { override fun read(buf: ByteBuffer) = try { AsyncPaymentsRole.entries[buf.getInt() - 1] @@ -13476,6 +13642,31 @@ object FfiConverterSequenceTypeTxOutput: FfiConverterRustBuffer> +object FfiConverterSequenceTypeAddressType: FfiConverterRustBuffer> { + override fun read(buf: ByteBuffer): List { + val len = buf.getInt() + return List(len) { + FfiConverterTypeAddressType.read(buf) + } + } + + override fun allocationSize(value: List): ULong { + val sizeForLength = 4UL + val sizeForItems = value.sumOf { FfiConverterTypeAddressType.allocationSize(it) } + return sizeForLength + sizeForItems + } + + override fun write(value: List, buf: ByteBuffer) { + buf.putInt(value.size) + value.iterator().forEach { + FfiConverterTypeAddressType.write(it, buf) + } + } +} + + + + object FfiConverterSequenceTypeLightningBalance: FfiConverterRustBuffer> { override fun read(buf: ByteBuffer): List { val len = buf.getInt() diff --git a/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-aarch64/libldk_node.dylib b/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-aarch64/libldk_node.dylib index b77fef4a4..c540252e7 100644 Binary files a/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-aarch64/libldk_node.dylib and b/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-aarch64/libldk_node.dylib differ diff --git a/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-x86-64/libldk_node.dylib b/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-x86-64/libldk_node.dylib index c9b9a44e9..5b744a98b 100644 Binary files a/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-x86-64/libldk_node.dylib and b/bindings/kotlin/ldk-node-jvm/lib/src/main/resources/darwin-x86-64/libldk_node.dylib differ diff 
--git a/bindings/ldk_node.udl b/bindings/ldk_node.udl
index 2cfdf54c1..d5cd8e144 100644
--- a/bindings/ldk_node.udl
+++ b/bindings/ldk_node.udl
@@ -17,6 +17,8 @@ dictionary Config {
     AnchorChannelsConfig? anchor_channels_config;
     RouteParametersConfig? route_parameters;
     boolean include_untrusted_pending_in_spendable;
+    AddressType address_type;
+    sequence<AddressType> address_types_to_monitor;
 };

 dictionary AnchorChannelsConfig {
@@ -65,6 +67,13 @@ enum WordCount {
     "Words24",
 };

+enum AddressType {
+    "Legacy",
+    "NestedSegwit",
+    "NativeSegwit",
+    "Taproot",
+};
+
 enum LogLevel {
     "Gossip",
     "Trace",
@@ -114,6 +123,8 @@ interface Builder {
     void set_log_facade_logger();
     void set_custom_logger(LogWriter log_writer);
     void set_network(Network network);
+    void set_address_type(AddressType address_type);
+    void set_address_types_to_monitor(sequence<AddressType> address_types_to_monitor);
     [Throws=BuildError]
     void set_listening_addresses(sequence<SocketAddress> listening_addresses);
     [Throws=BuildError]
@@ -184,6 +195,9 @@ interface Node {
     [Throws=NodeError]
     void remove_payment([ByRef]PaymentId payment_id);
     BalanceDetails list_balances();
+    [Throws=NodeError]
+    AddressTypeBalance get_balance_for_address_type(AddressType address_type);
+    sequence<AddressType> list_monitored_address_types();
     sequence<PaymentDetails> list_payments();
     sequence<PeerDetails> list_peers();
     sequence<ChannelDetails> list_channels();
@@ -276,6 +290,8 @@ interface OnchainPayment {
     [Throws=NodeError]
     Address new_address();
     [Throws=NodeError]
+    Address new_address_for_type(AddressType address_type);
+    [Throws=NodeError]
     sequence<SpendableUtxo> list_spendable_outputs();
     [Throws=NodeError]
     sequence<SpendableUtxo> select_utxos_with_algorithm(u64 target_amount_sats, FeeRate? fee_rate, CoinSelectionAlgorithm algorithm, sequence<SpendableUtxo>?
utxos);
@@ -392,8 +408,8 @@ enum NodeError {
     "TransactionAlreadyConfirmed",
     "NoSpendableOutputs",
     "CoinSelectionFailed",
-    "InvalidMnemonic",
-    "BackgroundSyncNotEnabled",
+    "InvalidMnemonic",
+    "BackgroundSyncNotEnabled",
 };

 dictionary NodeStatus {
@@ -771,6 +787,11 @@ dictionary BalanceDetails {
     sequence<PendingSweepBalance> pending_balances_from_channel_closures;
 };

+dictionary AddressTypeBalance {
+    u64 total_sats;
+    u64 spendable_sats;
+};
+
 dictionary ChannelConfig {
     u32 forwarding_fee_proportional_millionths;
     u32 forwarding_fee_base_msat;
diff --git a/bindings/python/pyproject.toml b/bindings/python/pyproject.toml
index 079fbce67..2a7496226 100644
--- a/bindings/python/pyproject.toml
+++ b/bindings/python/pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "ldk_node"
-version = "0.7.0-rc.18"
+version = "0.7.0-rc.19"
 authors = [
   { name="Elias Rohrer", email="dev@tnull.de" },
 ]
diff --git a/bindings/python/src/ldk_node/ldk_node.py b/bindings/python/src/ldk_node/ldk_node.py
index facc250f5..dd32add08 100644
--- a/bindings/python/src/ldk_node/ldk_node.py
+++ b/bindings/python/src/ldk_node/ldk_node.py
@@ -601,6 +601,10 @@ def _uniffi_check_api_checksums(lib):
         raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
     if lib.uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider() != 9090:
         raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
+    if lib.uniffi_ldk_node_checksum_method_builder_set_address_type() != 647:
+        raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
+    if lib.uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor() != 23561:
+        raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
     if lib.uniffi_ldk_node_checksum_method_builder_set_announcement_addresses() != 39271:
         raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project")
     if
lib.uniffi_ldk_node_checksum_method_builder_set_async_payments_role() != 16463: @@ -689,12 +693,16 @@ def _uniffi_check_api_checksums(lib): raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_get_address_balance() != 45284: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + if lib.uniffi_ldk_node_checksum_method_node_get_balance_for_address_type() != 34906: + raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_get_transaction_details() != 65000: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_list_balances() != 57528: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_list_channels() != 7954: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + if lib.uniffi_ldk_node_checksum_method_node_list_monitored_address_types() != 25084: + raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_list_payments() != 35002: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_node_list_peers() != 14889: @@ -785,6 +793,8 @@ def _uniffi_check_api_checksums(lib): raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address() != 37251: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") + if lib.uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type() != 9083: + raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your 
project") if lib.uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm() != 14084: raise InternalError("UniFFI API checksum mismatch: try cleaning and rebuilding your project") if lib.uniffi_ldk_node_checksum_method_onchainpayment_send_all_to_address() != 37748: @@ -1464,6 +1474,18 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): ctypes.POINTER(_UniffiRustCallStatus), ) _UniffiLib.uniffi_ldk_node_fn_method_builder_build_with_vss_store_and_header_provider.restype = ctypes.c_void_p +_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_type.argtypes = ( + ctypes.c_void_p, + _UniffiRustBuffer, + ctypes.POINTER(_UniffiRustCallStatus), +) +_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_type.restype = None +_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor.argtypes = ( + ctypes.c_void_p, + _UniffiRustBuffer, + ctypes.POINTER(_UniffiRustCallStatus), +) +_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor.restype = None _UniffiLib.uniffi_ldk_node_fn_method_builder_set_announcement_addresses.argtypes = ( ctypes.c_void_p, _UniffiRustBuffer, @@ -1802,6 +1824,12 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): ctypes.POINTER(_UniffiRustCallStatus), ) _UniffiLib.uniffi_ldk_node_fn_method_node_get_address_balance.restype = ctypes.c_uint64 +_UniffiLib.uniffi_ldk_node_fn_method_node_get_balance_for_address_type.argtypes = ( + ctypes.c_void_p, + _UniffiRustBuffer, + ctypes.POINTER(_UniffiRustCallStatus), +) +_UniffiLib.uniffi_ldk_node_fn_method_node_get_balance_for_address_type.restype = _UniffiRustBuffer _UniffiLib.uniffi_ldk_node_fn_method_node_get_transaction_details.argtypes = ( ctypes.c_void_p, _UniffiRustBuffer, @@ -1818,6 +1846,11 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): ctypes.POINTER(_UniffiRustCallStatus), ) _UniffiLib.uniffi_ldk_node_fn_method_node_list_channels.restype = _UniffiRustBuffer 
+_UniffiLib.uniffi_ldk_node_fn_method_node_list_monitored_address_types.argtypes = ( + ctypes.c_void_p, + ctypes.POINTER(_UniffiRustCallStatus), +) +_UniffiLib.uniffi_ldk_node_fn_method_node_list_monitored_address_types.restype = _UniffiRustBuffer _UniffiLib.uniffi_ldk_node_fn_method_node_list_payments.argtypes = ( ctypes.c_void_p, ctypes.POINTER(_UniffiRustCallStatus), @@ -2129,6 +2162,12 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): ctypes.POINTER(_UniffiRustCallStatus), ) _UniffiLib.uniffi_ldk_node_fn_method_onchainpayment_new_address.restype = _UniffiRustBuffer +_UniffiLib.uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type.argtypes = ( + ctypes.c_void_p, + _UniffiRustBuffer, + ctypes.POINTER(_UniffiRustCallStatus), +) +_UniffiLib.uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type.restype = _UniffiRustBuffer _UniffiLib.uniffi_ldk_node_fn_method_onchainpayment_select_utxos_with_algorithm.argtypes = ( ctypes.c_void_p, ctypes.c_uint64, @@ -2832,6 +2871,12 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): _UniffiLib.uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider.restype = ctypes.c_uint16 +_UniffiLib.uniffi_ldk_node_checksum_method_builder_set_address_type.argtypes = ( +) +_UniffiLib.uniffi_ldk_node_checksum_method_builder_set_address_type.restype = ctypes.c_uint16 +_UniffiLib.uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor.argtypes = ( +) +_UniffiLib.uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor.restype = ctypes.c_uint16 _UniffiLib.uniffi_ldk_node_checksum_method_builder_set_announcement_addresses.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_builder_set_announcement_addresses.restype = ctypes.c_uint16 @@ -2964,6 +3009,9 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): 
_UniffiLib.uniffi_ldk_node_checksum_method_node_get_address_balance.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_node_get_address_balance.restype = ctypes.c_uint16 +_UniffiLib.uniffi_ldk_node_checksum_method_node_get_balance_for_address_type.argtypes = ( +) +_UniffiLib.uniffi_ldk_node_checksum_method_node_get_balance_for_address_type.restype = ctypes.c_uint16 _UniffiLib.uniffi_ldk_node_checksum_method_node_get_transaction_details.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_node_get_transaction_details.restype = ctypes.c_uint16 @@ -2973,6 +3021,9 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): _UniffiLib.uniffi_ldk_node_checksum_method_node_list_channels.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_node_list_channels.restype = ctypes.c_uint16 +_UniffiLib.uniffi_ldk_node_checksum_method_node_list_monitored_address_types.argtypes = ( +) +_UniffiLib.uniffi_ldk_node_checksum_method_node_list_monitored_address_types.restype = ctypes.c_uint16 _UniffiLib.uniffi_ldk_node_checksum_method_node_list_payments.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_node_list_payments.restype = ctypes.c_uint16 @@ -3108,6 +3159,9 @@ class _UniffiVTableCallbackInterfaceVssHeaderProvider(ctypes.Structure): _UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_new_address.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_new_address.restype = ctypes.c_uint16 +_UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type.argtypes = ( +) +_UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type.restype = ctypes.c_uint16 _UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm.argtypes = ( ) _UniffiLib.uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm.restype = ctypes.c_uint16 @@ -4514,6 +4568,10 @@ def build_with_vss_store_and_fixed_headers(self, vss_url: "str",store_id: "str", raise 
NotImplementedError def build_with_vss_store_and_header_provider(self, vss_url: "str",store_id: "str",header_provider: "VssHeaderProvider"): raise NotImplementedError + def set_address_type(self, address_type: "AddressType"): + raise NotImplementedError + def set_address_types_to_monitor(self, address_types_to_monitor: "typing.List[AddressType]"): + raise NotImplementedError def set_announcement_addresses(self, announcement_addresses: "typing.List[SocketAddress]"): raise NotImplementedError def set_async_payments_role(self, role: "typing.Optional[AsyncPaymentsRole]"): @@ -4668,6 +4726,28 @@ def build_with_vss_store_and_header_provider(self, vss_url: "str",store_id: "str + def set_address_type(self, address_type: "AddressType") -> None: + _UniffiConverterTypeAddressType.check_lower(address_type) + + _uniffi_rust_call(_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_type,self._uniffi_clone_pointer(), + _UniffiConverterTypeAddressType.lower(address_type)) + + + + + + + def set_address_types_to_monitor(self, address_types_to_monitor: "typing.List[AddressType]") -> None: + _UniffiConverterSequenceTypeAddressType.check_lower(address_types_to_monitor) + + _uniffi_rust_call(_UniffiLib.uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor,self._uniffi_clone_pointer(), + _UniffiConverterSequenceTypeAddressType.lower(address_types_to_monitor)) + + + + + + def set_announcement_addresses(self, announcement_addresses: "typing.List[SocketAddress]") -> None: _UniffiConverterSequenceTypeSocketAddress.check_lower(announcement_addresses) @@ -5463,12 +5543,16 @@ def force_close_channel(self, user_channel_id: "UserChannelId",counterparty_node raise NotImplementedError def get_address_balance(self, address_str: "str"): raise NotImplementedError + def get_balance_for_address_type(self, address_type: "AddressType"): + raise NotImplementedError def get_transaction_details(self, txid: "Txid"): raise NotImplementedError def list_balances(self, ): raise NotImplementedError 
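The `get_balance_for_address_type` stub above returns the new `AddressTypeBalance` record defined later in this diff. As a self-contained sketch — not the generated bindings themselves — of how UniFFI typically lowers such a plain two-field record over the FFI, assuming its usual RustBuffer convention of big-endian integers written in field declaration order (`total_sats`, then `spendable_sats`):

```python
import struct

# Hypothetical standalone model of AddressTypeBalance's RustBuffer layout:
# two unsigned 64-bit integers, big-endian, in field declaration order.
def lower_address_type_balance(total_sats: int, spendable_sats: int) -> bytes:
    return struct.pack(">QQ", total_sats, spendable_sats)

def lift_address_type_balance(data: bytes) -> tuple:
    # Returns (total_sats, spendable_sats).
    return struct.unpack(">QQ", data)

buf = lower_address_type_balance(150_000, 120_000)
assert lift_address_type_balance(buf) == (150_000, 120_000)
assert len(buf) == 16  # 2 fields x 8 bytes each
```

This mirrors what the generated `_UniffiConverterTypeAddressTypeBalance.write`/`read` pair does via the shared buffer helpers; the function names here are illustrative only.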
def list_channels(self, ): raise NotImplementedError + def list_monitored_address_types(self, ): + raise NotImplementedError def list_payments(self, ): raise NotImplementedError def list_peers(self, ): @@ -5683,6 +5767,18 @@ def get_address_balance(self, address_str: "str") -> "int": + def get_balance_for_address_type(self, address_type: "AddressType") -> "AddressTypeBalance": + _UniffiConverterTypeAddressType.check_lower(address_type) + + return _UniffiConverterTypeAddressTypeBalance.lift( + _uniffi_rust_call_with_error(_UniffiConverterTypeNodeError,_UniffiLib.uniffi_ldk_node_fn_method_node_get_balance_for_address_type,self._uniffi_clone_pointer(), + _UniffiConverterTypeAddressType.lower(address_type)) + ) + + + + + def get_transaction_details(self, txid: "Txid") -> "typing.Optional[TransactionDetails]": _UniffiConverterTypeTxid.check_lower(txid) @@ -5713,6 +5809,15 @@ def list_channels(self, ) -> "typing.List[ChannelDetails]": + def list_monitored_address_types(self, ) -> "typing.List[AddressType]": + return _UniffiConverterSequenceTypeAddressType.lift( + _uniffi_rust_call(_UniffiLib.uniffi_ldk_node_fn_method_node_list_monitored_address_types,self._uniffi_clone_pointer(),) + ) + + + + + def list_payments(self, ) -> "typing.List[PaymentDetails]": return _UniffiConverterSequenceTypePaymentDetails.lift( _uniffi_rust_call(_UniffiLib.uniffi_ldk_node_fn_method_node_list_payments,self._uniffi_clone_pointer(),) @@ -6320,6 +6425,8 @@ def list_spendable_outputs(self, ): raise NotImplementedError def new_address(self, ): raise NotImplementedError + def new_address_for_type(self, address_type: "AddressType"): + raise NotImplementedError def select_utxos_with_algorithm(self, target_amount_sats: "int",fee_rate: "typing.Optional[FeeRate]",algorithm: "CoinSelectionAlgorithm",utxos: "typing.Optional[typing.List[SpendableUtxo]]"): raise NotImplementedError def send_all_to_address(self, address: "Address",retain_reserve: "bool",fee_rate: "typing.Optional[FeeRate]"): @@ -6440,6 
+6547,18 @@ def new_address(self, ) -> "Address": + def new_address_for_type(self, address_type: "AddressType") -> "Address": + _UniffiConverterTypeAddressType.check_lower(address_type) + + return _UniffiConverterTypeAddress.lift( + _uniffi_rust_call_with_error(_UniffiConverterTypeNodeError,_UniffiLib.uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type,self._uniffi_clone_pointer(), + _UniffiConverterTypeAddressType.lower(address_type)) + ) + + + + + def select_utxos_with_algorithm(self, target_amount_sats: "int",fee_rate: "typing.Optional[FeeRate]",algorithm: "CoinSelectionAlgorithm",utxos: "typing.Optional[typing.List[SpendableUtxo]]") -> "typing.List[SpendableUtxo]": _UniffiConverterUInt64.check_lower(target_amount_sats) @@ -7084,6 +7203,42 @@ def write(cls, value: VssHeaderProviderProtocol, buf: _UniffiRustBuffer): buf.write_u64(cls.lower(value)) +class AddressTypeBalance: + total_sats: "int" + spendable_sats: "int" + def __init__(self, *, total_sats: "int", spendable_sats: "int"): + self.total_sats = total_sats + self.spendable_sats = spendable_sats + + def __str__(self): + return "AddressTypeBalance(total_sats={}, spendable_sats={})".format(self.total_sats, self.spendable_sats) + + def __eq__(self, other): + if self.total_sats != other.total_sats: + return False + if self.spendable_sats != other.spendable_sats: + return False + return True + +class _UniffiConverterTypeAddressTypeBalance(_UniffiConverterRustBuffer): + @staticmethod + def read(buf): + return AddressTypeBalance( + total_sats=_UniffiConverterUInt64.read(buf), + spendable_sats=_UniffiConverterUInt64.read(buf), + ) + + @staticmethod + def check_lower(value): + _UniffiConverterUInt64.check_lower(value.total_sats) + _UniffiConverterUInt64.check_lower(value.spendable_sats) + + @staticmethod + def write(value, buf): + _UniffiConverterUInt64.write(value.total_sats, buf) + _UniffiConverterUInt64.write(value.spendable_sats, buf) + + class AnchorChannelsConfig: trusted_peers_no_reserve: 
"typing.List[PublicKey]" per_channel_reserve_sats: "int" @@ -7741,7 +7896,9 @@ class Config: anchor_channels_config: "typing.Optional[AnchorChannelsConfig]" route_parameters: "typing.Optional[RouteParametersConfig]" include_untrusted_pending_in_spendable: "bool" - def __init__(self, *, storage_dir_path: "str", network: "Network", listening_addresses: "typing.Optional[typing.List[SocketAddress]]", announcement_addresses: "typing.Optional[typing.List[SocketAddress]]", node_alias: "typing.Optional[NodeAlias]", trusted_peers_0conf: "typing.List[PublicKey]", probing_liquidity_limit_multiplier: "int", anchor_channels_config: "typing.Optional[AnchorChannelsConfig]", route_parameters: "typing.Optional[RouteParametersConfig]", include_untrusted_pending_in_spendable: "bool"): + address_type: "AddressType" + address_types_to_monitor: "typing.List[AddressType]" + def __init__(self, *, storage_dir_path: "str", network: "Network", listening_addresses: "typing.Optional[typing.List[SocketAddress]]", announcement_addresses: "typing.Optional[typing.List[SocketAddress]]", node_alias: "typing.Optional[NodeAlias]", trusted_peers_0conf: "typing.List[PublicKey]", probing_liquidity_limit_multiplier: "int", anchor_channels_config: "typing.Optional[AnchorChannelsConfig]", route_parameters: "typing.Optional[RouteParametersConfig]", include_untrusted_pending_in_spendable: "bool", address_type: "AddressType", address_types_to_monitor: "typing.List[AddressType]"): self.storage_dir_path = storage_dir_path self.network = network self.listening_addresses = listening_addresses @@ -7752,9 +7909,11 @@ def __init__(self, *, storage_dir_path: "str", network: "Network", listening_add self.anchor_channels_config = anchor_channels_config self.route_parameters = route_parameters self.include_untrusted_pending_in_spendable = include_untrusted_pending_in_spendable + self.address_type = address_type + self.address_types_to_monitor = address_types_to_monitor def __str__(self): - return 
"Config(storage_dir_path={}, network={}, listening_addresses={}, announcement_addresses={}, node_alias={}, trusted_peers_0conf={}, probing_liquidity_limit_multiplier={}, anchor_channels_config={}, route_parameters={}, include_untrusted_pending_in_spendable={})".format(self.storage_dir_path, self.network, self.listening_addresses, self.announcement_addresses, self.node_alias, self.trusted_peers_0conf, self.probing_liquidity_limit_multiplier, self.anchor_channels_config, self.route_parameters, self.include_untrusted_pending_in_spendable) + return "Config(storage_dir_path={}, network={}, listening_addresses={}, announcement_addresses={}, node_alias={}, trusted_peers_0conf={}, probing_liquidity_limit_multiplier={}, anchor_channels_config={}, route_parameters={}, include_untrusted_pending_in_spendable={}, address_type={}, address_types_to_monitor={})".format(self.storage_dir_path, self.network, self.listening_addresses, self.announcement_addresses, self.node_alias, self.trusted_peers_0conf, self.probing_liquidity_limit_multiplier, self.anchor_channels_config, self.route_parameters, self.include_untrusted_pending_in_spendable, self.address_type, self.address_types_to_monitor) def __eq__(self, other): if self.storage_dir_path != other.storage_dir_path: @@ -7777,6 +7936,10 @@ def __eq__(self, other): return False if self.include_untrusted_pending_in_spendable != other.include_untrusted_pending_in_spendable: return False + if self.address_type != other.address_type: + return False + if self.address_types_to_monitor != other.address_types_to_monitor: + return False return True class _UniffiConverterTypeConfig(_UniffiConverterRustBuffer): @@ -7793,6 +7956,8 @@ def read(buf): anchor_channels_config=_UniffiConverterOptionalTypeAnchorChannelsConfig.read(buf), route_parameters=_UniffiConverterOptionalTypeRouteParametersConfig.read(buf), include_untrusted_pending_in_spendable=_UniffiConverterBool.read(buf), + address_type=_UniffiConverterTypeAddressType.read(buf), + 
address_types_to_monitor=_UniffiConverterSequenceTypeAddressType.read(buf), ) @staticmethod @@ -7807,6 +7972,8 @@ def check_lower(value): _UniffiConverterOptionalTypeAnchorChannelsConfig.check_lower(value.anchor_channels_config) _UniffiConverterOptionalTypeRouteParametersConfig.check_lower(value.route_parameters) _UniffiConverterBool.check_lower(value.include_untrusted_pending_in_spendable) + _UniffiConverterTypeAddressType.check_lower(value.address_type) + _UniffiConverterSequenceTypeAddressType.check_lower(value.address_types_to_monitor) @staticmethod def write(value, buf): @@ -7820,6 +7987,8 @@ def write(value, buf): _UniffiConverterOptionalTypeAnchorChannelsConfig.write(value.anchor_channels_config, buf) _UniffiConverterOptionalTypeRouteParametersConfig.write(value.route_parameters, buf) _UniffiConverterBool.write(value.include_untrusted_pending_in_spendable, buf) + _UniffiConverterTypeAddressType.write(value.address_type, buf) + _UniffiConverterSequenceTypeAddressType.write(value.address_types_to_monitor, buf) class CustomTlvRecord: @@ -9139,6 +9308,60 @@ def write(value, buf): +class AddressType(enum.Enum): + LEGACY = 0 + + NESTED_SEGWIT = 1 + + NATIVE_SEGWIT = 2 + + TAPROOT = 3 + + + +class _UniffiConverterTypeAddressType(_UniffiConverterRustBuffer): + @staticmethod + def read(buf): + variant = buf.read_i32() + if variant == 1: + return AddressType.LEGACY + if variant == 2: + return AddressType.NESTED_SEGWIT + if variant == 3: + return AddressType.NATIVE_SEGWIT + if variant == 4: + return AddressType.TAPROOT + raise InternalError("Raw enum value doesn't match any cases") + + @staticmethod + def check_lower(value): + if value == AddressType.LEGACY: + return + if value == AddressType.NESTED_SEGWIT: + return + if value == AddressType.NATIVE_SEGWIT: + return + if value == AddressType.TAPROOT: + return + raise ValueError(value) + + @staticmethod + def write(value, buf): + if value == AddressType.LEGACY: + buf.write_i32(1) + if value == AddressType.NESTED_SEGWIT: 
+ buf.write_i32(2) + if value == AddressType.NATIVE_SEGWIT: + buf.write_i32(3) + if value == AddressType.TAPROOT: + buf.write_i32(4) + + + + + + + class AsyncPaymentsRole(enum.Enum): CLIENT = 0 @@ -15247,6 +15470,31 @@ def read(cls, buf): +class _UniffiConverterSequenceTypeAddressType(_UniffiConverterRustBuffer): + @classmethod + def check_lower(cls, value): + for item in value: + _UniffiConverterTypeAddressType.check_lower(item) + + @classmethod + def write(cls, value, buf): + items = len(value) + buf.write_i32(items) + for item in value: + _UniffiConverterTypeAddressType.write(item, buf) + + @classmethod + def read(cls, buf): + count = buf.read_i32() + if count < 0: + raise InternalError("Unexpected negative sequence length") + + return [ + _UniffiConverterTypeAddressType.read(buf) for i in range(count) + ] + + + class _UniffiConverterSequenceTypeLightningBalance(_UniffiConverterRustBuffer): @classmethod def check_lower(cls, value): @@ -16033,6 +16281,7 @@ def generate_entropy_mnemonic(word_count: "typing.Optional[WordCount]") -> "Mnem __all__ = [ "InternalError", + "AddressType", "AsyncPaymentsRole", "BalanceSource", "Bolt11InvoiceDescription", @@ -16058,6 +16307,7 @@ def generate_entropy_mnemonic(word_count: "typing.Optional[WordCount]") -> "Mnem "SyncType", "VssHeaderProviderError", "WordCount", + "AddressTypeBalance", "AnchorChannelsConfig", "BackgroundSyncConfig", "BalanceDetails", diff --git a/bindings/swift/Sources/LDKNode/LDKNode.swift b/bindings/swift/Sources/LDKNode/LDKNode.swift index 8dfc0cde6..1b2a1d06c 100644 --- a/bindings/swift/Sources/LDKNode/LDKNode.swift +++ b/bindings/swift/Sources/LDKNode/LDKNode.swift @@ -1562,6 +1562,10 @@ public protocol BuilderProtocol: AnyObject { func buildWithVssStoreAndHeaderProvider(vssUrl: String, storeId: String, headerProvider: VssHeaderProvider) throws -> Node + func setAddressType(addressType: AddressType) + + func setAddressTypesToMonitor(addressTypesToMonitor: [AddressType]) + func 
setAnnouncementAddresses(announcementAddresses: [SocketAddress]) throws func setAsyncPaymentsRole(role: AsyncPaymentsRole?) throws @@ -1711,6 +1715,18 @@ open class Builder: }) } + open func setAddressType(addressType: AddressType) { try! rustCall { + uniffi_ldk_node_fn_method_builder_set_address_type(self.uniffiClonePointer(), + FfiConverterTypeAddressType.lower(addressType), $0) + } + } + + open func setAddressTypesToMonitor(addressTypesToMonitor: [AddressType]) { try! rustCall { + uniffi_ldk_node_fn_method_builder_set_address_types_to_monitor(self.uniffiClonePointer(), + FfiConverterSequenceTypeAddressType.lower(addressTypesToMonitor), $0) + } + } + open func setAnnouncementAddresses(announcementAddresses: [SocketAddress]) throws { try rustCallWithError(FfiConverterTypeBuildError.lift) { uniffi_ldk_node_fn_method_builder_set_announcement_addresses(self.uniffiClonePointer(), FfiConverterSequenceTypeSocketAddress.lower(announcementAddresses), $0) @@ -2483,12 +2499,16 @@ public protocol NodeProtocol: AnyObject { func getAddressBalance(addressStr: String) throws -> UInt64 + func getBalanceForAddressType(addressType: AddressType) throws -> AddressTypeBalance + func getTransactionDetails(txid: Txid) -> TransactionDetails? func listBalances() -> BalanceDetails func listChannels() -> [ChannelDetails] + func listMonitoredAddressTypes() -> [AddressType] + func listPayments() -> [PaymentDetails] func listPeers() -> [PeerDetails] @@ -2670,6 +2690,13 @@ open class Node: }) } + open func getBalanceForAddressType(addressType: AddressType) throws -> AddressTypeBalance { + return try FfiConverterTypeAddressTypeBalance.lift(rustCallWithError(FfiConverterTypeNodeError.lift) { + uniffi_ldk_node_fn_method_node_get_balance_for_address_type(self.uniffiClonePointer(), + FfiConverterTypeAddressType.lower(addressType), $0) + }) + } + open func getTransactionDetails(txid: Txid) -> TransactionDetails? { return try! FfiConverterOptionTypeTransactionDetails.lift(try! 
rustCall { uniffi_ldk_node_fn_method_node_get_transaction_details(self.uniffiClonePointer(), @@ -2689,6 +2716,12 @@ open class Node: }) } + open func listMonitoredAddressTypes() -> [AddressType] { + return try! FfiConverterSequenceTypeAddressType.lift(try! rustCall { + uniffi_ldk_node_fn_method_node_list_monitored_address_types(self.uniffiClonePointer(), $0) + }) + } + open func listPayments() -> [PaymentDetails] { return try! FfiConverterSequenceTypePaymentDetails.lift(try! rustCall { uniffi_ldk_node_fn_method_node_list_payments(self.uniffiClonePointer(), $0) @@ -3174,6 +3207,8 @@ public protocol OnchainPaymentProtocol: AnyObject { func newAddress() throws -> Address + func newAddressForType(addressType: AddressType) throws -> Address + func selectUtxosWithAlgorithm(targetAmountSats: UInt64, feeRate: FeeRate?, algorithm: CoinSelectionAlgorithm, utxos: [SpendableUtxo]?) throws -> [SpendableUtxo] func sendAllToAddress(address: Address, retainReserve: Bool, feeRate: FeeRate?) throws -> Txid @@ -3277,6 +3312,13 @@ open class OnchainPayment: }) } + open func newAddressForType(addressType: AddressType) throws -> Address { + return try FfiConverterTypeAddress.lift(rustCallWithError(FfiConverterTypeNodeError.lift) { + uniffi_ldk_node_fn_method_onchainpayment_new_address_for_type(self.uniffiClonePointer(), + FfiConverterTypeAddressType.lower(addressType), $0) + }) + } + open func selectUtxosWithAlgorithm(targetAmountSats: UInt64, feeRate: FeeRate?, algorithm: CoinSelectionAlgorithm, utxos: [SpendableUtxo]?) 
throws -> [SpendableUtxo] { return try FfiConverterSequenceTypeSpendableUtxo.lift(rustCallWithError(FfiConverterTypeNodeError.lift) { uniffi_ldk_node_fn_method_onchainpayment_select_utxos_with_algorithm(self.uniffiClonePointer(), @@ -3963,6 +4005,67 @@ public func FfiConverterTypeVssHeaderProvider_lower(_ value: VssHeaderProvider) return FfiConverterTypeVssHeaderProvider.lower(value) } +public struct AddressTypeBalance { + public var totalSats: UInt64 + public var spendableSats: UInt64 + + // Default memberwise initializers are never public by default, so we + // declare one manually. + public init(totalSats: UInt64, spendableSats: UInt64) { + self.totalSats = totalSats + self.spendableSats = spendableSats + } +} + +extension AddressTypeBalance: Equatable, Hashable { + public static func == (lhs: AddressTypeBalance, rhs: AddressTypeBalance) -> Bool { + if lhs.totalSats != rhs.totalSats { + return false + } + if lhs.spendableSats != rhs.spendableSats { + return false + } + return true + } + + public func hash(into hasher: inout Hasher) { + hasher.combine(totalSats) + hasher.combine(spendableSats) + } +} + +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +public struct FfiConverterTypeAddressTypeBalance: FfiConverterRustBuffer { + public static func read(from buf: inout (data: Data, offset: Data.Index)) throws -> AddressTypeBalance { + return + try AddressTypeBalance( + totalSats: FfiConverterUInt64.read(from: &buf), + spendableSats: FfiConverterUInt64.read(from: &buf) + ) + } + + public static func write(_ value: AddressTypeBalance, into buf: inout [UInt8]) { + FfiConverterUInt64.write(value.totalSats, into: &buf) + FfiConverterUInt64.write(value.spendableSats, into: &buf) + } +} + +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +public func FfiConverterTypeAddressTypeBalance_lift(_ buf: RustBuffer) throws -> AddressTypeBalance { + return try FfiConverterTypeAddressTypeBalance.lift(buf) +} + +#if swift(>=5.8) + 
@_documentation(visibility: private) +#endif +public func FfiConverterTypeAddressTypeBalance_lower(_ value: AddressTypeBalance) -> RustBuffer { + return FfiConverterTypeAddressTypeBalance.lower(value) +} + public struct AnchorChannelsConfig { public var trustedPeersNoReserve: [PublicKey] public var perChannelReserveSats: UInt64 @@ -4891,10 +4994,12 @@ public struct Config { public var anchorChannelsConfig: AnchorChannelsConfig? public var routeParameters: RouteParametersConfig? public var includeUntrustedPendingInSpendable: Bool + public var addressType: AddressType + public var addressTypesToMonitor: [AddressType] // Default memberwise initializers are never public by default, so we // declare one manually. - public init(storageDirPath: String, network: Network, listeningAddresses: [SocketAddress]?, announcementAddresses: [SocketAddress]?, nodeAlias: NodeAlias?, trustedPeers0conf: [PublicKey], probingLiquidityLimitMultiplier: UInt64, anchorChannelsConfig: AnchorChannelsConfig?, routeParameters: RouteParametersConfig?, includeUntrustedPendingInSpendable: Bool) { + public init(storageDirPath: String, network: Network, listeningAddresses: [SocketAddress]?, announcementAddresses: [SocketAddress]?, nodeAlias: NodeAlias?, trustedPeers0conf: [PublicKey], probingLiquidityLimitMultiplier: UInt64, anchorChannelsConfig: AnchorChannelsConfig?, routeParameters: RouteParametersConfig?, includeUntrustedPendingInSpendable: Bool, addressType: AddressType, addressTypesToMonitor: [AddressType]) { self.storageDirPath = storageDirPath self.network = network self.listeningAddresses = listeningAddresses @@ -4905,6 +5010,8 @@ public struct Config { self.anchorChannelsConfig = anchorChannelsConfig self.routeParameters = routeParameters self.includeUntrustedPendingInSpendable = includeUntrustedPendingInSpendable + self.addressType = addressType + self.addressTypesToMonitor = addressTypesToMonitor } } @@ -4940,6 +5047,12 @@ extension Config: Equatable, Hashable { if 
lhs.includeUntrustedPendingInSpendable != rhs.includeUntrustedPendingInSpendable { return false } + if lhs.addressType != rhs.addressType { + return false + } + if lhs.addressTypesToMonitor != rhs.addressTypesToMonitor { + return false + } return true } @@ -4954,6 +5067,8 @@ extension Config: Equatable, Hashable { hasher.combine(anchorChannelsConfig) hasher.combine(routeParameters) hasher.combine(includeUntrustedPendingInSpendable) + hasher.combine(addressType) + hasher.combine(addressTypesToMonitor) } } @@ -4973,7 +5088,9 @@ public struct FfiConverterTypeConfig: FfiConverterRustBuffer { probingLiquidityLimitMultiplier: FfiConverterUInt64.read(from: &buf), anchorChannelsConfig: FfiConverterOptionTypeAnchorChannelsConfig.read(from: &buf), routeParameters: FfiConverterOptionTypeRouteParametersConfig.read(from: &buf), - includeUntrustedPendingInSpendable: FfiConverterBool.read(from: &buf) + includeUntrustedPendingInSpendable: FfiConverterBool.read(from: &buf), + addressType: FfiConverterTypeAddressType.read(from: &buf), + addressTypesToMonitor: FfiConverterSequenceTypeAddressType.read(from: &buf) ) } @@ -4988,6 +5105,8 @@ public struct FfiConverterTypeConfig: FfiConverterRustBuffer { FfiConverterOptionTypeAnchorChannelsConfig.write(value.anchorChannelsConfig, into: &buf) FfiConverterOptionTypeRouteParametersConfig.write(value.routeParameters, into: &buf) FfiConverterBool.write(value.includeUntrustedPendingInSpendable, into: &buf) + FfiConverterTypeAddressType.write(value.addressType, into: &buf) + FfiConverterSequenceTypeAddressType.write(value.addressTypesToMonitor, into: &buf) } } @@ -6914,6 +7033,70 @@ public func FfiConverterTypeTxOutput_lower(_ value: TxOutput) -> RustBuffer { // Note that we don't yet support `indirect` for enums. // See https://github.com/mozilla/uniffi-rs/issues/396 for further discussion. 
+public enum AddressType { + case legacy + case nestedSegwit + case nativeSegwit + case taproot +} + +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +public struct FfiConverterTypeAddressType: FfiConverterRustBuffer { + typealias SwiftType = AddressType + + public static func read(from buf: inout (data: Data, offset: Data.Index)) throws -> AddressType { + let variant: Int32 = try readInt(&buf) + switch variant { + case 1: return .legacy + + case 2: return .nestedSegwit + + case 3: return .nativeSegwit + + case 4: return .taproot + + default: throw UniffiInternalError.unexpectedEnumCase + } + } + + public static func write(_ value: AddressType, into buf: inout [UInt8]) { + switch value { + case .legacy: + writeInt(&buf, Int32(1)) + + case .nestedSegwit: + writeInt(&buf, Int32(2)) + + case .nativeSegwit: + writeInt(&buf, Int32(3)) + + case .taproot: + writeInt(&buf, Int32(4)) + } + } +} + +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +public func FfiConverterTypeAddressType_lift(_ buf: RustBuffer) throws -> AddressType { + return try FfiConverterTypeAddressType.lift(buf) +} + +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +public func FfiConverterTypeAddressType_lower(_ value: AddressType) -> RustBuffer { + return FfiConverterTypeAddressType.lower(value) +} + +extension AddressType: Equatable, Hashable {} + +// Note that we don't yet support `indirect` for enums. +// See https://github.com/mozilla/uniffi-rs/issues/396 for further discussion. 
+ public enum AsyncPaymentsRole { case client case server @@ -10720,6 +10903,31 @@ private struct FfiConverterSequenceTypeTxOutput: FfiConverterRustBuffer { } } +#if swift(>=5.8) + @_documentation(visibility: private) +#endif +private struct FfiConverterSequenceTypeAddressType: FfiConverterRustBuffer { + typealias SwiftType = [AddressType] + + static func write(_ value: [AddressType], into buf: inout [UInt8]) { + let len = Int32(value.count) + writeInt(&buf, len) + for item in value { + FfiConverterTypeAddressType.write(item, into: &buf) + } + } + + static func read(from buf: inout (data: Data, offset: Data.Index)) throws -> [AddressType] { + let len: Int32 = try readInt(&buf) + var seq = [AddressType]() + seq.reserveCapacity(Int(len)) + for _ in 0 ..< len { + try seq.append(FfiConverterTypeAddressType.read(from: &buf)) + } + return seq + } +} + #if swift(>=5.8) @_documentation(visibility: private) #endif @@ -12038,6 +12246,12 @@ private var initializationResult: InitializationResult = { if uniffi_ldk_node_checksum_method_builder_build_with_vss_store_and_header_provider() != 9090 { return InitializationResult.apiChecksumMismatch } + if uniffi_ldk_node_checksum_method_builder_set_address_type() != 647 { + return InitializationResult.apiChecksumMismatch + } + if uniffi_ldk_node_checksum_method_builder_set_address_types_to_monitor() != 23561 { + return InitializationResult.apiChecksumMismatch + } if uniffi_ldk_node_checksum_method_builder_set_announcement_addresses() != 39271 { return InitializationResult.apiChecksumMismatch } @@ -12170,6 +12384,9 @@ private var initializationResult: InitializationResult = { if uniffi_ldk_node_checksum_method_node_get_address_balance() != 45284 { return InitializationResult.apiChecksumMismatch } + if uniffi_ldk_node_checksum_method_node_get_balance_for_address_type() != 34906 { + return InitializationResult.apiChecksumMismatch + } if uniffi_ldk_node_checksum_method_node_get_transaction_details() != 65000 { return 
InitializationResult.apiChecksumMismatch } @@ -12179,6 +12396,9 @@ private var initializationResult: InitializationResult = { if uniffi_ldk_node_checksum_method_node_list_channels() != 7954 { return InitializationResult.apiChecksumMismatch } + if uniffi_ldk_node_checksum_method_node_list_monitored_address_types() != 25084 { + return InitializationResult.apiChecksumMismatch + } if uniffi_ldk_node_checksum_method_node_list_payments() != 35002 { return InitializationResult.apiChecksumMismatch } @@ -12314,6 +12534,9 @@ private var initializationResult: InitializationResult = { if uniffi_ldk_node_checksum_method_onchainpayment_new_address() != 37251 { return InitializationResult.apiChecksumMismatch } + if uniffi_ldk_node_checksum_method_onchainpayment_new_address_for_type() != 9083 { + return InitializationResult.apiChecksumMismatch + } if uniffi_ldk_node_checksum_method_onchainpayment_select_utxos_with_algorithm() != 14084 { return InitializationResult.apiChecksumMismatch } diff --git a/src/balance.rs b/src/balance.rs index d96278dae..2f791a78e 100644 --- a/src/balance.rs +++ b/src/balance.rs @@ -13,6 +13,26 @@ use lightning::sign::SpendableOutputDescriptor; use lightning::util::sweep::{OutputSpendStatus, TrackedSpendableOutput}; use lightning_types::payment::{PaymentHash, PaymentPreimage}; +/// Balance details for a specific address type wallet. +/// +/// Returned by [`Node::get_balance_for_address_type`]. +/// +/// [`Node::get_balance_for_address_type`]: crate::Node::get_balance_for_address_type +#[derive(Debug, Clone)] +pub struct AddressTypeBalance { + /// The total balance of the wallet for this address type. + pub total_sats: u64, + /// The currently spendable balance of the wallet for this address type. + /// + /// Note: This does not account for anchor channel reserves. Use + /// [`BalanceDetails::total_anchor_channels_reserve_sats`] from [`Node::list_balances`] + /// to get the aggregate reserve that applies across all wallets. 
+ /// + /// [`BalanceDetails::total_anchor_channels_reserve_sats`]: BalanceDetails::total_anchor_channels_reserve_sats + /// [`Node::list_balances`]: crate::Node::list_balances + pub spendable_sats: u64, +} + /// Details of the known available balances returned by [`Node::list_balances`]. /// /// [`Node::list_balances`]: crate::Node::list_balances diff --git a/src/builder.rs b/src/builder.rs index a85040d8c..59bac5896 100644 --- a/src/builder.rs +++ b/src/builder.rs @@ -14,7 +14,7 @@ use std::sync::{Arc, Mutex, Once, RwLock}; use std::time::SystemTime; use std::{fmt, fs}; -use bdk_wallet::template::Bip84; +use bdk_wallet::template::{Bip44, Bip49, Bip84, Bip86}; use bdk_wallet::{KeychainKind, Wallet as BdkWallet}; use bip39::Mnemonic; use bitcoin::bip32::{ChildNumber, Xpriv}; @@ -43,13 +43,13 @@ use lightning::util::persist::{ }; use lightning::util::ser::{Readable, ReadableArgs}; use lightning::util::sweep::OutputSweeper; -use lightning::{log_info, log_trace}; +use lightning::{log_info, log_trace, log_warn}; use lightning_persister::fs_store::FilesystemStore; use vss_client::headers::{FixedHeaders, LnurlAuthToJwtProvider, VssHeaderProvider}; use crate::chain::ChainSource; use crate::config::{ - default_user_config, may_announce_channel, AnnounceError, AsyncPaymentsRole, + default_user_config, may_announce_channel, AddressType, AnnounceError, AsyncPaymentsRole, BitcoindRestClientConfig, Config, ElectrumSyncConfig, EsploraSyncConfig, RuntimeSyncIntervals, DEFAULT_ESPLORA_SERVER_URL, DEFAULT_LOG_FILENAME, DEFAULT_LOG_LEVEL, WALLET_KEYS_SEED_LEN, }; @@ -92,6 +92,101 @@ const VSS_LNURL_AUTH_HARDENED_CHILD_INDEX: u32 = 138; const LSPS_HARDENED_CHILD_INDEX: u32 = 577; const PERSISTER_MAX_PENDING_UPDATES: u64 = 100; +/// Helper function to load a BDK wallet for a given address type. +/// +/// Returns `Ok(Some(wallet))` if the wallet exists, `Ok(None)` if it needs to be created, +/// or `Err` on failure. 
+fn load_wallet_for_address_type( + address_type: AddressType, xprv: Xpriv, network: Network, + persister: &mut KVStoreWalletPersister, +) -> Result< + Option>, + bdk_wallet::LoadWithPersistError, +> { + match address_type { + AddressType::Legacy => { + let descriptor = Bip44(xprv, KeychainKind::External); + let change_descriptor = Bip44(xprv, KeychainKind::Internal); + BdkWallet::load() + .descriptor(KeychainKind::External, Some(descriptor)) + .descriptor(KeychainKind::Internal, Some(change_descriptor)) + .extract_keys() + .check_network(network) + .load_wallet(persister) + }, + AddressType::NestedSegwit => { + let descriptor = Bip49(xprv, KeychainKind::External); + let change_descriptor = Bip49(xprv, KeychainKind::Internal); + BdkWallet::load() + .descriptor(KeychainKind::External, Some(descriptor)) + .descriptor(KeychainKind::Internal, Some(change_descriptor)) + .extract_keys() + .check_network(network) + .load_wallet(persister) + }, + AddressType::NativeSegwit => { + let descriptor = Bip84(xprv, KeychainKind::External); + let change_descriptor = Bip84(xprv, KeychainKind::Internal); + BdkWallet::load() + .descriptor(KeychainKind::External, Some(descriptor)) + .descriptor(KeychainKind::Internal, Some(change_descriptor)) + .extract_keys() + .check_network(network) + .load_wallet(persister) + }, + AddressType::Taproot => { + let descriptor = Bip86(xprv, KeychainKind::External); + let change_descriptor = Bip86(xprv, KeychainKind::Internal); + BdkWallet::load() + .descriptor(KeychainKind::External, Some(descriptor)) + .descriptor(KeychainKind::Internal, Some(change_descriptor)) + .extract_keys() + .check_network(network) + .load_wallet(persister) + }, + } +} + +/// Helper function to create a new BDK wallet for a given address type. 
+fn create_wallet_for_address_type(
+	address_type: AddressType, xprv: Xpriv, network: Network,
+	persister: &mut KVStoreWalletPersister,
+) -> Result<
+	bdk_wallet::PersistedWallet<KVStoreWalletPersister>,
+	bdk_wallet::CreateWithPersistError<<KVStoreWalletPersister as bdk_wallet::WalletPersister>::Error>,
+> {
+	match address_type {
+		AddressType::Legacy => {
+			let descriptor = Bip44(xprv, KeychainKind::External);
+			let change_descriptor = Bip44(xprv, KeychainKind::Internal);
+			BdkWallet::create(descriptor, change_descriptor)
+				.network(network)
+				.create_wallet(persister)
+		},
+		AddressType::NestedSegwit => {
+			let descriptor = Bip49(xprv, KeychainKind::External);
+			let change_descriptor = Bip49(xprv, KeychainKind::Internal);
+			BdkWallet::create(descriptor, change_descriptor)
+				.network(network)
+				.create_wallet(persister)
+		},
+		AddressType::NativeSegwit => {
+			let descriptor = Bip84(xprv, KeychainKind::External);
+			let change_descriptor = Bip84(xprv, KeychainKind::Internal);
+			BdkWallet::create(descriptor, change_descriptor)
+				.network(network)
+				.create_wallet(persister)
+		},
+		AddressType::Taproot => {
+			let descriptor = Bip86(xprv, KeychainKind::External);
+			let change_descriptor = Bip86(xprv, KeychainKind::Internal);
+			BdkWallet::create(descriptor, change_descriptor)
+				.network(network)
+				.create_wallet(persister)
+		},
+	}
+}
+
 #[derive(Debug, Clone)]
 enum ChainDataSourceConfig {
 	Esplora {
@@ -567,6 +662,37 @@ impl NodeBuilder {
 		self
 	}
 
+	/// Sets the address type for the on-chain wallet.
+	///
+	/// This determines what type of addresses will be generated for receiving funds and change outputs:
+	/// - `Legacy` (P2PKH): BIP 44 - older format, higher fees
+	/// - `NestedSegwit` (P2SH-wrapped P2WPKH): BIP 49 - compatible with older wallets
+	/// - `NativeSegwit` (P2WPKH): BIP 84 - default, lower fees, modern standard
+	/// - `Taproot` (P2TR): BIP 86 - newest format, lowest fees, best privacy
+	///
+	/// **Note:** Lightning channel operations (funding transactions and shutdown scripts) always
+	/// require witness addresses and will use native segwit regardless of this setting.
+	///
+	/// Default is `NativeSegwit`.
+	pub fn set_address_type(&mut self, address_type: AddressType) -> &mut Self {
+		self.config.address_type = address_type;
+		self
+	}
+
+	/// Sets additional address types to monitor for existing funds.
+	///
+	/// The `address_type` field determines what type of addresses will be generated for new
+	/// receiving and change addresses, while `address_types_to_monitor` determines which additional
+	/// address types should be scanned for existing funds.
+	///
+	/// **Note:** Duplicates, and any entry equal to the primary `address_type`, are ignored
+	/// during wallet setup.
+	pub fn set_address_types_to_monitor(
+		&mut self, address_types_to_monitor: Vec<AddressType>,
+	) -> &mut Self {
+		self.config.address_types_to_monitor = address_types_to_monitor;
+		self
+	}
+
 	/// Sets the IP address and TCP port on which [`Node`] will listen for incoming network connections.
 	pub fn set_listening_addresses(
 		&mut self, listening_addresses: Vec<SocketAddress>,
@@ -1064,6 +1190,18 @@ impl ArcedNodeBuilder {
 		self.inner.write().unwrap().set_network(network);
 	}
 
+	/// Sets the address type for the on-chain wallet.
+	///
+	/// See [`NodeBuilder::set_address_type`] for details.
+	pub fn set_address_type(&self, address_type: AddressType) {
+		self.inner.write().unwrap().set_address_type(address_type);
+	}
+
+	/// Sets additional address types to monitor for existing funds.
+	///
+	/// See [`NodeBuilder::set_address_types_to_monitor`] for details.
+	pub fn set_address_types_to_monitor(&self, address_types_to_monitor: Vec<AddressType>) {
+		self.inner.write().unwrap().set_address_types_to_monitor(address_types_to_monitor);
+	}
+
 	/// Sets the IP address and TCP port on which [`Node`] will listen for incoming network connections.
 	pub fn set_listening_addresses(
 		&self, listening_addresses: Vec<SocketAddress>,
@@ -1215,46 +1353,20 @@ fn build_with_store_internal(
 		}
 	}
 
-	// Prepare wallet components (instant operations) before parallel VSS reads
-	let xprv = bitcoin::bip32::Xpriv::new_master(config.network, &seed_bytes).map_err(|e| {
-		log_error!(logger, "Failed to derive master secret: {}", e);
-		BuildError::InvalidSeedBytes
-	})?;
-	let descriptor = Bip84(xprv, KeychainKind::External);
-	let change_descriptor = Bip84(xprv, KeychainKind::Internal);
-
 	let tx_broadcaster = Arc::new(TransactionBroadcaster::new(Arc::clone(&logger)));
 	let fee_estimator = Arc::new(OnchainFeeEstimator::new());
 
-	// Execute VSS reads in parallel: node_metrics, payments, and wallet
-	let (node_metrics_result, payments_result, wallet_result) = runtime.block_on(async {
+	// Execute reads in parallel: node_metrics and payments
+	let (node_metrics_result, payments_result) = runtime.block_on(async {
 		let metrics_store = Arc::clone(&kv_store);
 		let metrics_logger = Arc::clone(&logger);
 		let payments_store = Arc::clone(&kv_store);
 		let payments_logger = Arc::clone(&logger);
-		let wallet_store = Arc::clone(&kv_store);
-		let wallet_logger: Arc<Logger> = Arc::clone(&logger);
-
 		tokio::join!(
 			tokio::task::spawn_blocking(move || read_node_metrics(metrics_store, metrics_logger)),
 			read_payments_async(payments_store, payments_logger),
-			tokio::task::spawn_blocking({
-				let network = config.network;
-				let descriptor = descriptor.clone();
-				let change_descriptor = change_descriptor.clone();
-				move || {
-					let mut persister = KVStoreWalletPersister::new(wallet_store, wallet_logger);
-					let result = BdkWallet::load()
-						.descriptor(KeychainKind::External, Some(descriptor))
-
.descriptor(KeychainKind::Internal, Some(change_descriptor)) - .extract_keys() - .check_network(network) - .load_wallet(&mut persister); - (result, persister) - } - }) ) }); @@ -1275,6 +1387,12 @@ fn build_with_store_internal( }, }; + // Clear stale timestamps for wallets no longer monitored. + { + let additional_types = config.additional_address_types(); + node_metrics.write().unwrap().retain_wallet_sync_timestamps(&additional_types); + } + // Process payments result let payment_store = match payments_result { Ok(payments) => Arc::new(PaymentStore::new( @@ -1290,33 +1408,6 @@ fn build_with_store_internal( }, }; - // Process wallet result - let (wallet_load_result, mut wallet_persister) = match wallet_result { - Ok(result) => result, - Err(e) => { - log_error!(logger, "Task join error loading wallet: {}", e); - return Err(BuildError::WalletSetupFailed); - }, - }; - let wallet_opt = wallet_load_result.map_err(|e| match e { - bdk_wallet::LoadWithPersistError::InvalidChangeSet(bdk_wallet::LoadError::Mismatch( - bdk_wallet::LoadMismatch::Network { loaded, expected }, - )) => { - log_error!( - logger, - "Failed to setup wallet: Networks do not match. 
Expected {} but got {}", - expected, - loaded - ); - BuildError::NetworkMismatch - }, - _ => { - log_error!(logger, "Failed to set up wallet: {}", e); - BuildError::WalletSetupFailed - }, - })?; - - // Chain source setup let (chain_source, chain_tip_opt) = match chain_data_source_config { Some(ChainDataSourceConfig::Esplora { server_url, headers, sync_config }) => { let sync_config = sync_config.unwrap_or(EsploraSyncConfig::default()); @@ -1404,17 +1495,55 @@ fn build_with_store_internal( }; let chain_source = Arc::new(chain_source); - // Initialize the on-chain wallet + let xprv = bitcoin::bip32::Xpriv::new_master(config.network, &seed_bytes).map_err(|e| { + log_error!(logger, "Failed to derive master secret: {}", e); + BuildError::InvalidSeedBytes + })?; + + let mut wallet_persister = KVStoreWalletPersister::new( + Arc::clone(&kv_store), + Arc::clone(&logger), + config.address_type, + ); + + let wallet_opt = load_wallet_for_address_type( + config.address_type, + xprv, + config.network, + &mut wallet_persister, + ) + .map_err(|e| match e { + bdk_wallet::LoadWithPersistError::InvalidChangeSet(bdk_wallet::LoadError::Mismatch( + bdk_wallet::LoadMismatch::Network { loaded, expected }, + )) => { + log_error!( + logger, + "Failed to setup wallet: Networks do not match. 
Expected {} but got {}", + expected, + loaded + ); + BuildError::NetworkMismatch + }, + _ => { + log_error!(logger, "Failed to set up wallet: {}", e); + BuildError::WalletSetupFailed + }, + })?; + let bdk_wallet = match wallet_opt { Some(wallet) => wallet, None => { - let mut wallet = BdkWallet::create(descriptor, change_descriptor) - .network(config.network) - .create_wallet(&mut wallet_persister) - .map_err(|e| { - log_error!(logger, "Failed to set up wallet: {}", e); - BuildError::WalletSetupFailed - })?; + // Create new wallet + let mut wallet = create_wallet_for_address_type( + config.address_type, + xprv, + config.network, + &mut wallet_persister, + ) + .map_err(|e| { + log_error!(logger, "Failed to set up wallet: {}", e); + BuildError::WalletSetupFailed + })?; if let Some(best_block) = chain_tip_opt { // Insert the first checkpoint if we have it, to avoid resyncing from genesis. @@ -1434,14 +1563,112 @@ fn build_with_store_internal( }, }; + let mut additional_wallets = Vec::new(); + for address_type in &config.address_types_to_monitor { + if *address_type == config.address_type { + continue; + } + + match (|| -> Result<_, BuildError> { + let mut additional_persister = KVStoreWalletPersister::new( + Arc::clone(&kv_store), + Arc::clone(&logger), + *address_type, + ); + + let additional_wallet_opt = load_wallet_for_address_type( + *address_type, + xprv, + config.network, + &mut additional_persister, + ) + .map_err(|e| { + log_error!( + logger, + "Failed to load additional wallet for {:?}: {}", + address_type, + e + ); + BuildError::WalletSetupFailed + })?; + + let additional_wallet = match additional_wallet_opt { + Some(wallet) => wallet, + None => { + // Create new wallet + let mut wallet = create_wallet_for_address_type( + *address_type, + xprv, + config.network, + &mut additional_persister, + ) + .map_err(|e| { + log_error!( + logger, + "Failed to create additional wallet for {:?}: {}", + address_type, + e + ); + BuildError::WalletSetupFailed + })?; + + if 
let Some(best_block) = chain_tip_opt { + let mut latest_checkpoint = wallet.latest_checkpoint(); + let block_id = bdk_chain::BlockId { + height: best_block.height, + hash: best_block.block_hash, + }; + latest_checkpoint = latest_checkpoint.insert(block_id); + let update = bdk_wallet::Update { + chain: Some(latest_checkpoint), + ..Default::default() + }; + wallet.apply_update(update).map_err(|e| { + log_error!( + logger, + "Failed to apply checkpoint for additional wallet {:?}: {}", + address_type, + e + ); + BuildError::WalletSetupFailed + })?; + } + wallet + }, + }; + + Ok((*address_type, additional_wallet, additional_persister)) + })() { + Ok((addr_type, wallet, persister)) => { + additional_wallets.push((addr_type, wallet, persister)); + log_info!( + logger, + "Created additional wallet for monitoring address type: {:?}", + address_type + ); + }, + Err(e) => { + log_warn!( + logger, + "Failed to create additional wallet for {:?}: {}. Continuing with primary wallet only.", + address_type, + e + ); + }, + } + } + let wallet = Arc::new(Wallet::new( bdk_wallet, wallet_persister, + additional_wallets, Arc::clone(&tx_broadcaster), Arc::clone(&fee_estimator), Arc::clone(&payment_store), Arc::clone(&config), Arc::clone(&logger), + Some(Arc::clone(&chain_source)), + Some(Arc::clone(&runtime)), )); // Initialize the KeysManager diff --git a/src/chain/bitcoind.rs b/src/chain/bitcoind.rs index 003c59b7f..6c321a1f6 100644 --- a/src/chain/bitcoind.rs +++ b/src/chain/bitcoind.rs @@ -637,6 +637,15 @@ impl BitcoindChainSource { } } } + + /// Fetches a transaction by its txid from the Bitcoind server. + /// Returns `None` if the transaction is not found. 
+	pub(super) async fn get_transaction(&self, txid: &Txid) -> Result<Option<Transaction>, Error> {
+		self.api_client.get_raw_transaction(txid).await.map_err(|e| {
+			log_error!(self.logger, "Failed to fetch transaction {} from Bitcoind: {}", txid, e);
+			Error::WalletOperationFailed
+		})
+	}
 }
 
 pub enum BitcoindClient {
diff --git a/src/chain/electrum.rs b/src/chain/electrum.rs
index e524c8a64..966a00564 100644
--- a/src/chain/electrum.rs
+++ b/src/chain/electrum.rs
@@ -21,6 +21,7 @@ use electrum_client::{
 	Batch, Client as ElectrumClient, ConfigBuilder as ElectrumConfigBuilder, ElectrumApi,
 };
 use lightning::chain::{Confirm, Filter, WatchedOutput};
+use lightning::log_warn;
 use lightning::util::ser::Writeable;
 use lightning_transaction_sync::ElectrumSyncClient;
@@ -172,24 +173,148 @@ impl ElectrumChainSource {
 		let cached_txs = onchain_wallet.get_cached_txs();
 
-		let res = if incremental_sync {
+		let primary_events = if incremental_sync {
 			let incremental_sync_request = onchain_wallet.get_incremental_sync_request();
 			let incremental_sync_fut = electrum_client
-				.get_incremental_sync_wallet_update(incremental_sync_request, cached_txs);
+				.get_incremental_sync_wallet_update(incremental_sync_request, cached_txs.clone());
 			let now = Instant::now();
 			let update_res = incremental_sync_fut.await.map(|u| u.into());
-			apply_wallet_update(update_res, now)
+			match apply_wallet_update(update_res, now) {
+				Ok(events) => events,
+				Err(Error::WalletOperationTimeout) => {
+					log_info!(self.logger, "Primary wallet sync timed out after {} seconds.
Continuing with monitored wallets...", BDK_WALLET_SYNC_TIMEOUT_SECS);
+					Vec::new()
+				},
+				Err(e) => {
+					log_error!(self.logger, "Failed to sync primary wallet: {}", e);
+					Vec::new()
+				},
+			}
 		} else {
 			let full_scan_request = onchain_wallet.get_full_scan_request();
 			let full_scan_fut =
-				electrum_client.get_full_scan_wallet_update(full_scan_request, cached_txs);
+				electrum_client.get_full_scan_wallet_update(full_scan_request, cached_txs.clone());
 			let now = Instant::now();
 			let update_res = full_scan_fut.await.map(|u| u.into());
-			apply_wallet_update(update_res, now)
+			match apply_wallet_update(update_res, now) {
+				Ok(events) => events,
+				Err(Error::WalletOperationTimeout) => {
+					log_info!(self.logger, "Primary wallet sync timed out after {} seconds. Continuing with monitored wallets...", BDK_WALLET_SYNC_TIMEOUT_SECS);
+					Vec::new()
+				},
+				Err(e) => {
+					log_error!(self.logger, "Failed to sync primary wallet: {}", e);
+					Vec::new()
+				},
+			}
 		};
 
-		res
+		// Sync additional monitored wallets in parallel
+		let mut all_events = primary_events;
+
+		let sync_requests: Vec<_> = self
+			.config
+			.additional_address_types()
+			.into_iter()
+			.filter_map(|address_type| {
+				let do_incremental = self
+					.node_metrics
+					.read()
+					.unwrap()
+					.get_wallet_sync_timestamp(address_type)
+					.is_some();
+				match onchain_wallet.get_wallet_sync_request(address_type) {
+					Ok((full_scan_req, incremental_req)) => {
+						Some((address_type, full_scan_req, incremental_req, do_incremental))
+					},
+					Err(e) => {
+						log_info!(
+							self.logger,
+							"Skipping sync for wallet {:?}: {}",
+							address_type,
+							e
+						);
+						None
+					},
+				}
+			})
+			.collect();
+
+		let mut join_set = tokio::task::JoinSet::new();
+		for (address_type, full_scan_req, incremental_req, do_incremental) in sync_requests {
+			let client = Arc::clone(&electrum_client);
+			let txs = cached_txs.clone();
+			join_set.spawn(async move {
+				let result: Result<bdk_wallet::Update, Error> = if do_incremental {
+					client
+						.get_incremental_sync_wallet_update(incremental_req, txs)
+						.await
+						.map(|u|
u.into()) + } else { + client.get_full_scan_wallet_update(full_scan_req, txs).await.map(|u| u.into()) + }; + (address_type, result) + }); + } + + while let Some(join_result) = join_set.join_next().await { + let (address_type, result) = match join_result { + Ok(r) => r, + Err(e) => { + log_warn!(self.logger, "Wallet sync task panicked: {}", e); + continue; + }, + }; + + let wallet_events = match result { + Ok(update) => { + let events = onchain_wallet + .apply_update_to_wallet(address_type, update) + .unwrap_or_else(|e| { + log_warn!( + self.logger, + "Failed to apply update to wallet {:?}: {}", + address_type, + e + ); + Vec::new() + }); + let unix_time_secs_opt = + SystemTime::now().duration_since(UNIX_EPOCH).ok().map(|d| d.as_secs()); + if let Some(ts) = unix_time_secs_opt { + self.node_metrics + .write() + .unwrap() + .set_wallet_sync_timestamp(address_type, ts); + } + events + }, + Err(e) => { + log_warn!(self.logger, "Failed to sync wallet {:?}: {}", address_type, e); + Vec::new() + }, + }; + + all_events.extend(wallet_events); + } + + { + let locked_node_metrics = self.node_metrics.read().unwrap(); + if let Err(e) = write_node_metrics( + &locked_node_metrics, + Arc::clone(&self.kv_store), + Arc::clone(&self.logger), + ) { + log_error!(self.logger, "Failed to persist node metrics: {}", e); + } + } + + if let Err(e) = onchain_wallet.update_payment_store_for_all_transactions() { + log_info!(self.logger, "Failed to update payment store after wallet syncs: {}", e); + } + + Ok(all_events) } pub(crate) async fn sync_lightning_wallet( @@ -327,6 +452,20 @@ impl ElectrumChainSource { }; electrum_client.get_address_balance(address).await } + + /// Fetches a transaction by its txid from the Electrum server. + /// Returns `None` if the transaction is not found. 
+	pub(super) async fn get_transaction(&self, txid: &Txid) -> Result<Option<Transaction>, Error> {
+		let electrum_client: Arc<ElectrumRuntimeClient> =
+			if let Some(client) = self.electrum_runtime_status.read().unwrap().client().as_ref() {
+				Arc::clone(client)
+			} else {
+				log_error!(self.logger, "Electrum client not available");
+				return Err(Error::ConnectionFailed);
+			};
+
+		electrum_client.get_transaction(txid).await
+	}
 }
 
 impl Filter for ElectrumChainSource {
@@ -473,6 +612,45 @@ impl ElectrumRuntimeClient {
 		}
 	}
 
+	/// Fetches a transaction by its txid from the Electrum server.
+	/// Returns `None` if the transaction is not found.
+	pub(crate) async fn get_transaction(&self, txid: &Txid) -> Result<Option<Transaction>, Error> {
+		use electrum_client::ElectrumApi;
+
+		let electrum_client = Arc::clone(&self.electrum_client);
+		let txid_clone = *txid;
+		let tx_result = self
+			.runtime
+			.spawn_blocking(move || {
+				electrum_client
+					.transaction_get(&txid_clone)
+					.map_err(|e| std::io::Error::new(std::io::ErrorKind::Other, format!("{}", e)))
+			})
+			.await;
+
+		match tx_result {
+			Ok(Ok(tx)) => Ok(Some(tx)),
+			Ok(Err(e)) => {
+				log_error!(
+					self.logger,
+					"Failed to fetch transaction {} from Electrum: {}",
+					txid,
+					e
+				);
+				Err(Error::WalletOperationFailed)
+			},
+			Err(e) => {
+				log_error!(
+					self.logger,
+					"Failed to fetch transaction {} from Electrum: {}",
+					txid,
+					e
+				);
+				Err(Error::WalletOperationFailed)
+			},
+		}
+	}
+
 	async fn sync_confirmables(
 		&self, confirmables: Vec<Arc<dyn Confirm + Sync + Send>>,
 	) -> Result<(), Error> {
diff --git a/src/chain/esplora.rs b/src/chain/esplora.rs
index 2cffaf0bb..e04cb8066 100644
--- a/src/chain/esplora.rs
+++ b/src/chain/esplora.rs
@@ -14,6 +14,7 @@ use bdk_wallet::event::WalletEvent;
 use bitcoin::{FeeRate, Network, Script, Transaction, Txid};
 use esplora_client::AsyncClient as EsploraAsyncClient;
 use lightning::chain::{Confirm, Filter, WatchedOutput};
+use lightning::log_warn;
 use lightning::util::ser::Writeable;
 use lightning_transaction_sync::EsploraSyncClient;
@@ -116,87 +117,35 @@ impl EsploraChainSource {
 		let
incremental_sync = self.node_metrics.read().unwrap().latest_onchain_wallet_sync_timestamp.is_some(); - macro_rules! get_and_apply_wallet_update { - ($sync_future: expr) => {{ - let now = Instant::now(); - match $sync_future.await { - Ok(res) => match res { - Ok(update) => match onchain_wallet.apply_update(update) { - Ok(wallet_events) => { - log_info!( - self.logger, - "{} of on-chain wallet finished in {}ms.", - if incremental_sync { "Incremental sync" } else { "Sync" }, - now.elapsed().as_millis() - ); - let unix_time_secs_opt = SystemTime::now() - .duration_since(UNIX_EPOCH) - .ok() - .map(|d| d.as_secs()); - { - let mut locked_node_metrics = self.node_metrics.write().unwrap(); - locked_node_metrics.latest_onchain_wallet_sync_timestamp = unix_time_secs_opt; - write_node_metrics( - &*locked_node_metrics, - Arc::clone(&self.kv_store), - Arc::clone(&self.logger) - )?; - } - Ok(wallet_events) - }, - Err(e) => Err(e), - }, - Err(e) => match *e { - esplora_client::Error::Reqwest(he) => { - if let Some(status_code) = he.status() { - log_error!( - self.logger, - "{} of on-chain wallet failed due to HTTP {} error: {}", - if incremental_sync { "Incremental sync" } else { "Sync" }, - status_code, - he, - ); - } else { - log_error!( - self.logger, - "{} of on-chain wallet failed due to HTTP error: {}", - if incremental_sync { "Incremental sync" } else { "Sync" }, - he, - ); - } - Err(Error::WalletOperationFailed) - }, - _ => { - log_error!( - self.logger, - "{} of on-chain wallet failed due to Esplora error: {}", - if incremental_sync { "Incremental sync" } else { "Sync" }, - e - ); - Err(Error::WalletOperationFailed) - }, - }, - }, - Err(e) => { - log_error!( - self.logger, - "{} of on-chain wallet timed out: {}", - if incremental_sync { "Incremental sync" } else { "Sync" }, - e - ); - Err(Error::WalletOperationTimeout) - }, - } - }} - } - - if incremental_sync { + let primary_events = if incremental_sync { let sync_request = onchain_wallet.get_incremental_sync_request(); 
let wallet_sync_timeout_fut = tokio::time::timeout( Duration::from_secs(BDK_WALLET_SYNC_TIMEOUT_SECS), self.esplora_client.sync(sync_request, BDK_CLIENT_CONCURRENCY), ); - get_and_apply_wallet_update!(wallet_sync_timeout_fut) + match wallet_sync_timeout_fut.await { + Ok(Ok(update)) => match onchain_wallet.apply_update(update) { + Ok(events) => { + log_info!( + self.logger, + "Incremental sync of primary wallet finished successfully" + ); + events + }, + Err(e) => { + log_error!(self.logger, "Failed to apply update to primary wallet: {}", e); + Vec::new() + }, + }, + Ok(Err(e)) => { + log_error!(self.logger, "Failed to sync primary wallet: {}", e); + Vec::new() + }, + Err(_) => { + log_info!(self.logger, "Primary wallet sync timed out after {} seconds. Continuing with monitored wallets...", BDK_WALLET_SYNC_TIMEOUT_SECS); + Vec::new() + }, + } } else { let full_scan_request = onchain_wallet.get_full_scan_request(); let wallet_sync_timeout_fut = tokio::time::timeout( @@ -207,8 +156,147 @@ impl EsploraChainSource { BDK_CLIENT_CONCURRENCY, ), ); - get_and_apply_wallet_update!(wallet_sync_timeout_fut) + match wallet_sync_timeout_fut.await { + Ok(Ok(update)) => match onchain_wallet.apply_update(update) { + Ok(events) => { + log_info!(self.logger, "Full scan of primary wallet finished successfully"); + events + }, + Err(e) => { + log_error!(self.logger, "Failed to apply update to primary wallet: {}", e); + Vec::new() + }, + }, + Ok(Err(e)) => { + log_error!(self.logger, "Failed to sync primary wallet: {}", e); + Vec::new() + }, + Err(_) => { + log_info!(self.logger, "Primary wallet sync timed out after {} seconds. 
Continuing with monitored wallets...", BDK_WALLET_SYNC_TIMEOUT_SECS); + Vec::new() + }, + } + }; + + // Sync additional monitored wallets in parallel + let mut all_events = primary_events; + + let sync_requests: Vec<_> = self + .config + .additional_address_types() + .into_iter() + .filter_map(|address_type| { + let do_incremental = self + .node_metrics + .read() + .unwrap() + .get_wallet_sync_timestamp(address_type) + .is_some(); + match onchain_wallet.get_wallet_sync_request(address_type) { + Ok((full_scan_req, incremental_req)) => { + Some((address_type, full_scan_req, incremental_req, do_incremental)) + }, + Err(e) => { + log_info!( + self.logger, + "Skipping sync for wallet {:?}: {}", + address_type, + e + ); + None + }, + } + }) + .collect(); + + let mut join_set = tokio::task::JoinSet::new(); + for (address_type, full_scan_req, incremental_req, do_incremental) in sync_requests { + let client = self.esplora_client.clone(); + join_set.spawn(async move { + let result = if do_incremental { + tokio::time::timeout( + Duration::from_secs(BDK_WALLET_SYNC_TIMEOUT_SECS), + client.sync(incremental_req, BDK_CLIENT_CONCURRENCY), + ) + .await + .map(|r| r.map(|u| bdk_wallet::Update::from(u))) + } else { + tokio::time::timeout( + Duration::from_secs(BDK_WALLET_SYNC_TIMEOUT_SECS), + client.full_scan( + full_scan_req, + BDK_CLIENT_STOP_GAP, + BDK_CLIENT_CONCURRENCY, + ), + ) + .await + .map(|r| r.map(|u| bdk_wallet::Update::from(u))) + }; + (address_type, result) + }); } + + while let Some(join_result) = join_set.join_next().await { + let (address_type, result) = match join_result { + Ok(r) => r, + Err(e) => { + log_warn!(self.logger, "Wallet sync task panicked: {}", e); + continue; + }, + }; + + let wallet_events = match result { + Ok(Ok(update)) => { + let events = onchain_wallet + .apply_update_to_wallet(address_type, update) + .unwrap_or_else(|e| { + log_warn!( + self.logger, + "Failed to apply update to wallet {:?}: {}", + address_type, + e + ); + Vec::new() + }); + let 
unix_time_secs_opt =
+						SystemTime::now().duration_since(UNIX_EPOCH).ok().map(|d| d.as_secs());
+					if let Some(ts) = unix_time_secs_opt {
+						self.node_metrics
+							.write()
+							.unwrap()
+							.set_wallet_sync_timestamp(address_type, ts);
+					}
+					events
+				},
+				Ok(Err(e)) => {
+					log_warn!(self.logger, "Failed to sync wallet {:?}: {}", address_type, e);
+					Vec::new()
+				},
+				Err(_) => {
+					log_warn!(self.logger, "Sync timeout for wallet {:?}", address_type);
+					Vec::new()
+				},
+			};
+
+			all_events.extend(wallet_events);
+		}
+
+		{
+			let locked_node_metrics = self.node_metrics.read().unwrap();
+			if let Err(e) = write_node_metrics(
+				&locked_node_metrics,
+				Arc::clone(&self.kv_store),
+				Arc::clone(&self.logger),
+			) {
+				log_error!(self.logger, "Failed to persist node metrics: {}", e);
+			}
+		}
+
+		if let Err(e) = onchain_wallet.update_payment_store_for_all_transactions() {
+			log_info!(self.logger, "Failed to update payment store after wallet syncs: {}", e);
+		}
+
+		Ok(all_events)
 	}
 
 	pub(super) async fn sync_lightning_wallet(
@@ -240,6 +328,13 @@ impl EsploraChainSource {
 		&self, channel_manager: Arc<ChannelManager>, chain_monitor: Arc<ChainMonitor>,
 		output_sweeper: Arc<Sweeper>,
 	) -> Result<(), Error> {
+		log_info!(
+			self.logger,
+			"Starting Lightning wallet sync (timeout: {} seconds)...",
+			LDK_WALLET_SYNC_TIMEOUT_SECS
+		);
+		let sync_start = Instant::now();
+
 		let sync_cman = Arc::clone(&channel_manager);
 		let sync_cmon = Arc::clone(&chain_monitor);
 		let sync_sweeper = Arc::clone(&output_sweeper);
@@ -249,17 +344,21 @@
 			&*sync_sweeper as &(dyn Confirm + Sync + Send),
 		];
 
-		let timeout_fut = tokio::time::timeout(
-			Duration::from_secs(LDK_WALLET_SYNC_TIMEOUT_SECS),
-			self.tx_sync.sync(confirmables),
+		log_info!(
+			self.logger,
+			"Calling tx_sync.sync() with {} confirmables...",
+			confirmables.len()
 		);
+		let sync_fut = self.tx_sync.sync(confirmables);
+		let timeout_fut =
+			tokio::time::timeout(Duration::from_secs(LDK_WALLET_SYNC_TIMEOUT_SECS), sync_fut);
 		let now = Instant::now();
 		match timeout_fut.await {
Ok(res) => match res { Ok(()) => { log_info!( self.logger, - "Sync of Lightning wallet finished in {}ms.", + "Sync of Lightning wallet finished successfully in {}ms.", now.elapsed().as_millis() ); @@ -286,12 +385,43 @@ impl EsploraChainSource { Ok(()) }, Err(e) => { - log_error!(self.logger, "Sync of Lightning wallet failed: {}", e); - Err(e.into()) + let elapsed = sync_start.elapsed(); + let error_string = format!("{}", e); + log_error!( + self.logger, + "Sync of Lightning wallet failed after {:.2} seconds. Error: {}", + elapsed.as_secs_f64(), + error_string + ); + // Check if this is actually a timeout error from the underlying sync + // If so, return TxSyncTimeout instead of TxSyncFailed + if error_string.to_lowercase().contains("timeout") + || error_string.to_lowercase().contains("timed out") + || error_string.contains("Elapsed") + { + log_error!(self.logger, "Underlying sync error appears to be a timeout (error message contains 'timeout'), returning TxSyncTimeout"); + Err(Error::TxSyncTimeout) + } else { + log_error!( + self.logger, + "Underlying sync error is NOT a timeout, returning TxSyncFailed" + ); + Err(e.into()) + } }, }, Err(e) => { - log_error!(self.logger, "Lightning wallet sync timed out: {}", e); + let elapsed = sync_start.elapsed(); + // This is a tokio::time::timeout timeout - the future ran for the full duration + // The error 'e' here is tokio::time::error::Elapsed + log_error!( + self.logger, + "Lightning wallet sync timed out after {:.2} seconds (timeout was {} seconds). This is a tokio::time::timeout timeout (the future ran for the full {} seconds). Error: {}", + elapsed.as_secs_f64(), + LDK_WALLET_SYNC_TIMEOUT_SECS, + LDK_WALLET_SYNC_TIMEOUT_SECS, + e + ); Err(Error::TxSyncTimeout) }, } @@ -468,6 +598,19 @@ impl EsploraChainSource { Err(_) => None, } } + + /// Fetches a transaction by its txid from the Esplora server. + /// Returns `None` if the transaction is not found. 
+	pub(super) async fn get_transaction(&self, txid: &Txid) -> Result<Option<Transaction>, Error> {
+		match self.esplora_client.get_tx(txid).await {
+			Ok(Some(tx)) => Ok(Some(tx)),
+			Ok(None) => Ok(None),
+			Err(e) => {
+				log_error!(self.logger, "Failed to fetch transaction {} from Esplora: {}", txid, e);
+				Err(Error::WalletOperationFailed)
+			},
+		}
+	}
 }
 
 impl Filter for EsploraChainSource {
diff --git a/src/chain/mod.rs b/src/chain/mod.rs
index 611017999..3784cf44e 100644
--- a/src/chain/mod.rs
+++ b/src/chain/mod.rs
@@ -15,7 +15,7 @@ use std::sync::{Arc, Mutex, RwLock};
 use std::time::Duration;
 
 use bdk_wallet::event::WalletEvent as BdkWalletEvent;
-use bitcoin::{Script, Txid};
+use bitcoin::{Script, Transaction, Txid};
 use lightning::chain::{BestBlock, Filter};
 use lightning_block_sync::gossip::UtxoSource;
@@ -162,6 +162,47 @@ fn get_transaction_details(
 	Some(TransactionDetails { amount_sats, inputs, outputs })
 }
 
+// Calculate balance details and emit a balance update event if the balance changed.
+// This is a helper to avoid duplicating the balance calculation/emission logic.
+async fn compute_and_emit_balance_update<L2: Deref>(
+	wallet: &crate::wallet::Wallet, channel_manager: &Arc<ChannelManager>,
+	chain_monitor: &Arc<ChainMonitor>, config: &Arc<Config>,
+	node_metrics: &Arc<RwLock<NodeMetrics>>, kv_store: &Arc<DynStore>,
+	event_queue: &EventQueue<L2>, logger: &Arc<L2>,
+) -> Result<(), Error>
+where
+	L2::Target: LdkLogger,
+{
+	let cur_anchor_reserve_sats =
+		crate::total_anchor_channels_reserve_sats(channel_manager, config);
+	let (total_onchain_balance_sats, spendable_onchain_balance_sats) =
+		wallet.get_balances(cur_anchor_reserve_sats).unwrap_or((0, 0));
+
+	let mut total_lightning_balance_sats = 0;
+	for channel_id in chain_monitor.list_monitors() {
+		if let Ok(monitor) = chain_monitor.get_monitor(channel_id) {
+			for ldk_balance in monitor.get_claimable_balances() {
+				total_lightning_balance_sats += ldk_balance.claimable_amount_satoshis();
+			}
+		}
+	}
+
+	let balance_details = crate::BalanceDetails {
+		total_onchain_balance_sats,
+		spendable_onchain_balance_sats,
+		total_anchor_channels_reserve_sats: std::cmp::min(
+			cur_anchor_reserve_sats,
+			total_onchain_balance_sats,
+		),
+		total_lightning_balance_sats,
+		lightning_balances: Vec::new(),
+		pending_balances_from_channel_closures: Vec::new(),
+	};
+
+	check_and_emit_balance_update(node_metrics, &balance_details, event_queue, kv_store, logger)
+		.await
+}
+
 // Process BDK wallet events and emit corresponding ldk-node events via the event queue.
 async fn process_wallet_events(
 	wallet_events: Vec<BdkWalletEvent>, wallet: &crate::wallet::Wallet,
@@ -437,6 +478,23 @@ impl ChainSource {
 		}
 	}
 
+	/// Update the background sync configuration at runtime.
+	///
+	/// This allows changing sync intervals while the node is running.
+	/// Returns an error if background syncing was disabled at build time.
+	pub(crate) fn set_background_sync_config(
+		&self, config: BackgroundSyncConfig,
+	) -> Result<(), Error> {
+		if let Some(ref sender) = self.sync_config_sender {
+			// Send will only fail if there are no receivers, which shouldn't happen
+			// while the sync loop is running
+			let _ = sender.send(config);
+			Ok(())
+		} else {
+			Err(Error::BackgroundSyncNotEnabled)
+		}
+	}
+
 	pub(crate) fn as_utxo_source(&self) -> Option<Arc<dyn UtxoSource>> {
 		match &self.kind {
 			ChainSourceKind::Bitcoind(bitcoind_chain_source) => {
@@ -583,6 +641,22 @@ impl ChainSource {
 		}
 	}
 
+	fn node_metrics(&self) -> Arc<RwLock<NodeMetrics>> {
+		match &self.kind {
+			ChainSourceKind::Esplora(es) => Arc::clone(&es.node_metrics),
+			ChainSourceKind::Electrum(el) => Arc::clone(&el.node_metrics),
+			ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.node_metrics),
+		}
+	}
+
+	fn kv_store(&self) -> Arc<DynStore> {
+		match &self.kind {
+			ChainSourceKind::Esplora(es) => Arc::clone(&es.kv_store),
+			ChainSourceKind::Electrum(el) => Arc::clone(&el.kv_store),
+			ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.kv_store),
+		}
+	}
+
 	async fn start_tx_based_sync_loop(
 		&self, mut stop_sync_receiver: tokio::sync::watch::Receiver<()>,
 		mut config_receiver: tokio::sync::watch::Receiver<BackgroundSyncConfig>,
@@ -693,23 +767,6 @@ impl ChainSource {
 		*self.event_queue.lock().unwrap() = Some(event_queue);
 	}
 
-	/// Update the background sync configuration at runtime.
-	///
-	/// This allows changing sync intervals while the node is running.
-	/// Returns an error if background syncing was disabled at build time.
-	pub(crate) fn set_background_sync_config(
-		&self, config: BackgroundSyncConfig,
-	) -> Result<(), Error> {
-		if let Some(ref sender) = self.sync_config_sender {
-			// Send will only fail if there are no receivers, which shouldn't happen
-			// while the sync loop is running
-			let _ = sender.send(config);
-			Ok(())
-		} else {
-			Err(Error::BackgroundSyncNotEnabled)
-		}
-	}
-
 	// Synchronize the onchain wallet via transaction-based protocols (i.e., Esplora, Electrum,
 	// etc.)
with event emission support. async fn sync_onchain_wallet_with_events( @@ -720,194 +777,66 @@ impl ChainSource { let wallet = self.onchain_wallet.lock().unwrap().clone(); let wallet = wallet.ok_or(Error::WalletOperationFailed)?; - match &self.kind { - ChainSourceKind::Esplora(esplora_chain_source) => { - // Track unconfirmed transactions before sync to detect evictions - let prev_unconfirmed_txids = wallet.get_unconfirmed_txids(); - - let wallet_events = - esplora_chain_source.sync_onchain_wallet(Arc::clone(&wallet)).await?; - - // Process wallet events if event queue is provided - if let Some(event_queue) = event_queue { - process_wallet_events( - wallet_events, - &wallet, - event_queue, - &self.logger, - channel_manager, - chain_monitor, - ) - .await?; - - // Check for evicted transactions - check_and_emit_evicted_transactions( - prev_unconfirmed_txids, - &wallet, - event_queue, - &self.logger, - ) - .await; - - // Emit SyncCompleted event - let synced_height = wallet.current_best_block().height; - event_queue - .add_event(Event::SyncCompleted { - sync_type: SyncType::OnchainWallet, - synced_block_height: synced_height, - }) - .await?; - // Check for balance changes and emit BalanceChanged event if needed - if let (Some(cm), Some(chain_mon), Some(cfg)) = - (channel_manager, chain_monitor, config) - { - let cur_anchor_reserve_sats = - crate::total_anchor_channels_reserve_sats(cm, cfg); - let (total_onchain_balance_sats, spendable_onchain_balance_sats) = - wallet.get_balances(cur_anchor_reserve_sats).unwrap_or((0, 0)); - - let mut total_lightning_balance_sats = 0; - for channel_id in chain_mon.list_monitors() { - if let Ok(monitor) = chain_mon.get_monitor(channel_id) { - for ldk_balance in monitor.get_claimable_balances() { - total_lightning_balance_sats += - ldk_balance.claimable_amount_satoshis(); - } - } - } + let prev_unconfirmed_txids = wallet.get_unconfirmed_txids(); - let balance_details = crate::BalanceDetails { - total_onchain_balance_sats, - 
spendable_onchain_balance_sats, - total_anchor_channels_reserve_sats: std::cmp::min( - cur_anchor_reserve_sats, - total_onchain_balance_sats, - ), - total_lightning_balance_sats, - lightning_balances: Vec::new(), - pending_balances_from_channel_closures: Vec::new(), - }; - - let node_metrics = match &self.kind { - ChainSourceKind::Esplora(es) => Arc::clone(&es.node_metrics), - ChainSourceKind::Electrum(el) => Arc::clone(&el.node_metrics), - ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.node_metrics), - }; - let kv_store = match &self.kind { - ChainSourceKind::Esplora(es) => Arc::clone(&es.kv_store), - ChainSourceKind::Electrum(el) => Arc::clone(&el.kv_store), - ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.kv_store), - }; - - check_and_emit_balance_update( - &node_metrics, - &balance_details, - event_queue, - &kv_store, - &self.logger, - ) - .await?; - } - } - Ok(()) + let wallet_events = match &self.kind { + ChainSourceKind::Esplora(esplora_chain_source) => { + esplora_chain_source.sync_onchain_wallet(Arc::clone(&wallet)).await? 
}, ChainSourceKind::Electrum(electrum_chain_source) => { - // Track unconfirmed transactions before sync to detect evictions - let prev_unconfirmed_txids = wallet.get_unconfirmed_txids(); - - let wallet_events = - electrum_chain_source.sync_onchain_wallet(Arc::clone(&wallet)).await?; - - // Process wallet events if event queue is provided - if let Some(event_queue) = event_queue { - process_wallet_events( - wallet_events, - &wallet, - event_queue, - &self.logger, - channel_manager, - chain_monitor, - ) - .await?; - - // Check for evicted transactions - check_and_emit_evicted_transactions( - prev_unconfirmed_txids, - &wallet, - event_queue, - &self.logger, - ) - .await; - - // Emit SyncCompleted event - let synced_height = wallet.current_best_block().height; - event_queue - .add_event(Event::SyncCompleted { - sync_type: SyncType::OnchainWallet, - synced_block_height: synced_height, - }) - .await?; - - // Check for balance changes and emit BalanceChanged event if needed - if let (Some(cm), Some(chain_mon), Some(cfg)) = - (channel_manager, chain_monitor, config) - { - let cur_anchor_reserve_sats = - crate::total_anchor_channels_reserve_sats(cm, cfg); - let (total_onchain_balance_sats, spendable_onchain_balance_sats) = - wallet.get_balances(cur_anchor_reserve_sats).unwrap_or((0, 0)); - - let mut total_lightning_balance_sats = 0; - for channel_id in chain_mon.list_monitors() { - if let Ok(monitor) = chain_mon.get_monitor(channel_id) { - for ldk_balance in monitor.get_claimable_balances() { - total_lightning_balance_sats += - ldk_balance.claimable_amount_satoshis(); - } - } - } - - let balance_details = crate::BalanceDetails { - total_onchain_balance_sats, - spendable_onchain_balance_sats, - total_anchor_channels_reserve_sats: std::cmp::min( - cur_anchor_reserve_sats, - total_onchain_balance_sats, - ), - total_lightning_balance_sats, - lightning_balances: Vec::new(), - pending_balances_from_channel_closures: Vec::new(), - }; - - let node_metrics = match &self.kind { - 
ChainSourceKind::Esplora(es) => Arc::clone(&es.node_metrics), - ChainSourceKind::Electrum(el) => Arc::clone(&el.node_metrics), - ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.node_metrics), - }; - let kv_store = match &self.kind { - ChainSourceKind::Esplora(es) => Arc::clone(&es.kv_store), - ChainSourceKind::Electrum(el) => Arc::clone(&el.kv_store), - ChainSourceKind::Bitcoind(bd) => Arc::clone(&bd.kv_store), - }; - - check_and_emit_balance_update( - &node_metrics, - &balance_details, - event_queue, - &kv_store, - &self.logger, - ) - .await?; - } - } - Ok(()) + electrum_chain_source.sync_onchain_wallet(Arc::clone(&wallet)).await? }, ChainSourceKind::Bitcoind { .. } => { // In BitcoindRpc mode we sync lightning and onchain wallet in one go via // `ChainPoller`. So nothing to do here. unreachable!("Onchain wallet will be synced via chain polling") }, + }; + + if let Some(event_queue) = event_queue { + process_wallet_events( + wallet_events, + &wallet, + event_queue, + &self.logger, + channel_manager, + chain_monitor, + ) + .await?; + + check_and_emit_evicted_transactions( + prev_unconfirmed_txids, + &wallet, + event_queue, + &self.logger, + ) + .await; + + let synced_height = wallet.current_best_block().height; + event_queue + .add_event(Event::SyncCompleted { + sync_type: SyncType::OnchainWallet, + synced_block_height: synced_height, + }) + .await?; + + // Check for balance changes and emit BalanceChanged event if needed + if let (Some(cm), Some(chain_mon), Some(cfg)) = (channel_manager, chain_monitor, config) + { + compute_and_emit_balance_update( + &wallet, + cm, + chain_mon, + cfg, + &self.node_metrics(), + &self.kv_store(), + event_queue, + &self.logger, + ) + .await?; + } } + Ok(()) } // Synchronize the Lightning wallet via transaction-based protocols (i.e., Esplora, Electrum, @@ -1049,6 +978,22 @@ impl ChainSource { }, } } + + /// Fetches a transaction by its txid from the chain source. + /// Returns `None` if the transaction is not found. 
+	pub(crate) async fn get_transaction(&self, txid: &Txid) -> Result<Option<Transaction>, Error> {
+		match &self.kind {
+			ChainSourceKind::Esplora(esplora_chain_source) => {
+				esplora_chain_source.get_transaction(txid).await
+			},
+			ChainSourceKind::Electrum(electrum_chain_source) => {
+				electrum_chain_source.get_transaction(txid).await
+			},
+			ChainSourceKind::Bitcoind(bitcoind_chain_source) => {
+				bitcoind_chain_source.get_transaction(txid).await
+			},
+		}
+	}
 }
 
 fn periodically_archive_fully_resolved_monitors(
diff --git a/src/config.rs b/src/config.rs
index fd162dcb5..3df4bccb1 100644
--- a/src/config.rs
+++ b/src/config.rs
@@ -99,6 +99,74 @@ pub const WALLET_KEYS_SEED_LEN: usize = 64;
 
 // The timeout after which we abort a external scores sync operation.
 pub(crate) const EXTERNAL_PATHFINDING_SCORES_SYNC_TIMEOUT_SECS: u64 = 5;
 
+/// Supported Bitcoin address types for the on-chain wallet.
+///
+/// This determines what type of addresses will be generated for receiving funds and change outputs.
+/// Note that Lightning channel operations (funding, shutdown scripts) always require witness addresses.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
+pub enum AddressType {
+	/// Legacy addresses (P2PKH) - BIP 44
+	/// Example: 1A1zP1eP5QGefi2DMPTfTL5SLmv7DivfNa
+	Legacy,
+	/// Nested SegWit addresses (P2SH-wrapped P2WPKH) - BIP 49
+	/// Example: 3J98t1WpEZ73CNmQviecrnyiWrnqRhWNLy
+	NestedSegwit,
+	/// Native SegWit addresses (P2WPKH) - BIP 84
+	/// Example: bc1qw508d6qejxtdg4y5r3zarvary0c5xw7kv8f3t4
+	NativeSegwit,
+	/// Taproot addresses (P2TR) - BIP 86
+	/// Example: bc1p5d7rjq7g6rdk2yhzks9smlaqtedr4dekq08ge8ztwac72sfr9rusxg3297
+	Taproot,
+}
+
+impl Default for AddressType {
+	fn default() -> Self {
+		AddressType::NativeSegwit
+	}
+}
+
+impl lightning::util::ser::Writeable for AddressType {
+	fn write<W: lightning::util::ser::Writer>(
+		&self, writer: &mut W,
+	) -> Result<(), bitcoin::io::Error> {
+		match self {
+			AddressType::Legacy => 0u8.write(writer),
+			AddressType::NestedSegwit => 1u8.write(writer),
+			AddressType::NativeSegwit => 2u8.write(writer),
+			AddressType::Taproot => 3u8.write(writer),
+		}
+	}
+}
+
+impl lightning::util::ser::Readable for AddressType {
+	fn read<R: bitcoin::io::Read>(
+		reader: &mut R,
+	) -> Result<Self, lightning::ln::msgs::DecodeError> {
+		let discriminant: u8 = lightning::util::ser::Readable::read(reader)?;
+		match discriminant {
+			0 => Ok(AddressType::Legacy),
+			1 => Ok(AddressType::NestedSegwit),
+			2 => Ok(AddressType::NativeSegwit),
+			3 => Ok(AddressType::Taproot),
+			_ => Err(lightning::ln::msgs::DecodeError::InvalidValue),
+		}
+	}
+}
+
+impl Config {
+	/// Returns the additional address types to monitor, excluding the primary address type.
+	///
+	/// This filters `address_types_to_monitor` to remove any duplicates of the primary `address_type`,
+	/// since those would be redundant to monitor separately.
+	pub fn additional_address_types(&self) -> Vec<AddressType> {
+		self.address_types_to_monitor
+			.iter()
+			.copied()
+			.filter(|&at| at != self.address_type)
+			.collect()
+	}
+}
+
 #[derive(Debug, Clone)]
 /// Represents the configuration of an [`Node`] instance.
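The `Writeable`/`Readable` impls above persist `AddressType` as a single-byte discriminant and reject unknown values on read. A minimal standalone sketch of that round-trip, with the lightning ser traits replaced by plain byte values for illustration:

```rust
// Standalone sketch of the single-byte discriminant scheme used by the
// Writeable/Readable impls above; lightning's Writer/Read traits are
// replaced by plain u8 values here, so this is illustrative only.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum AddressType {
    Legacy,
    NestedSegwit,
    NativeSegwit,
    Taproot,
}

impl AddressType {
    // Mirrors the Writeable impl: one stable byte per variant.
    fn to_byte(self) -> u8 {
        match self {
            AddressType::Legacy => 0,
            AddressType::NestedSegwit => 1,
            AddressType::NativeSegwit => 2,
            AddressType::Taproot => 3,
        }
    }

    // Mirrors the Readable impl: unknown discriminants are a decode error.
    fn from_byte(b: u8) -> Result<Self, &'static str> {
        match b {
            0 => Ok(AddressType::Legacy),
            1 => Ok(AddressType::NestedSegwit),
            2 => Ok(AddressType::NativeSegwit),
            3 => Ok(AddressType::Taproot),
            _ => Err("InvalidValue"),
        }
    }
}

fn main() {
    let all = [
        AddressType::Legacy,
        AddressType::NestedSegwit,
        AddressType::NativeSegwit,
        AddressType::Taproot,
    ];
    for at in all {
        // Every variant must survive a write/read round-trip.
        assert_eq!(AddressType::from_byte(at.to_byte()).unwrap(), at);
    }
    // Out-of-range bytes must fail rather than silently map to a variant.
    assert!(AddressType::from_byte(4).is_err());
}
```

Because the discriminants end up in persisted wallet data, a future variant needs a fresh discriminant; renumbering existing ones would misread previously stored values.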
/// @@ -197,6 +265,31 @@ pub struct Config { /// /// [`BalanceDetails::spendable_onchain_balance_sats`]: crate::BalanceDetails::spendable_onchain_balance_sats pub include_untrusted_pending_in_spendable: bool, + /// The address type to use for the on-chain wallet (receiving funds and change addresses). + /// + /// This determines what type of addresses will be generated: + /// - `Legacy` (P2PKH): BIP 44 - older format, higher fees + /// - `NestedSegwit` (P2SH-wrapped P2WPKH): BIP 49 - compatible with older wallets + /// - `NativeSegwit` (P2WPKH): BIP 84 - default, lower fees, modern standard + /// - `Taproot` (P2TR): BIP 86 - newest format, lowest fees, best privacy + /// + /// **Note:** Lightning channel operations (funding transactions and shutdown scripts) always + /// require witness addresses and will use native segwit regardless of this setting. + /// + /// Default is `NativeSegwit`. + pub address_type: AddressType, + /// Additional address types to monitor for existing funds. + /// + /// This allows tracking funds in multiple address types simultaneously. For example, if you + /// previously used Legacy addresses but now want to use Native Segwit for new addresses, you + /// can add `Legacy` here to continue monitoring your old Legacy addresses. + /// + /// The `address_type` field determines what type of addresses will be generated for new + /// receiving and change addresses, while `address_types_to_monitor` determines which additional + /// address types should be scanned for existing funds. + /// + /// Default is empty (only the primary `address_type` is monitored). 
+	pub address_types_to_monitor: Vec<AddressType>,
 }
 
 impl Default for Config {
@@ -212,6 +305,8 @@ impl Default for Config {
 			route_parameters: None,
 			node_alias: None,
 			include_untrusted_pending_in_spendable: false,
+			address_type: AddressType::default(),
+			address_types_to_monitor: Vec::new(),
 		}
 	}
 }
diff --git a/src/ffi/types.rs b/src/ffi/types.rs
index 90b29d70b..020f5bd22 100644
--- a/src/ffi/types.rs
+++ b/src/ffi/types.rs
@@ -42,13 +42,16 @@ pub use lightning_types::payment::{PaymentHash, PaymentPreimage, PaymentSecret};
 pub use lightning_types::string::UntrustedString;
 pub use vss_client::headers::{VssHeaderProvider, VssHeaderProviderError};
 
+pub use crate::balance::AddressTypeBalance;
 use crate::builder::sanitize_alias;
 pub use crate::config::{
-	default_config, AnchorChannelsConfig, BackgroundSyncConfig, ElectrumSyncConfig,
-	EsploraSyncConfig, MaxDustHTLCExposure,
+	battery_saving_sync_intervals, default_config, AddressType, AnchorChannelsConfig,
+	BackgroundSyncConfig, ElectrumSyncConfig, EsploraSyncConfig, MaxDustHTLCExposure,
+	RuntimeSyncIntervals,
 };
 use crate::error::Error;
 pub use crate::graph::{ChannelInfo, ChannelUpdateInfo, NodeAnnouncementInfo, NodeInfo};
+pub use crate::io::utils::derive_node_secret_from_mnemonic;
 pub use crate::liquidity::{LSPS1OrderStatus, LSPS2ServiceConfig};
 pub use crate::logger::{LogLevel, LogRecord, LogWriter};
 pub use crate::payment::store::{
diff --git a/src/io/utils.rs b/src/io/utils.rs
index b616b79cf..7d1e126a0 100644
--- a/src/io/utils.rs
+++ b/src/io/utils.rs
@@ -421,13 +421,24 @@ macro_rules!
impl_read_write_change_set_type { $key:expr ) => { pub(crate) fn $read_name( - kv_store: Arc, logger: L, + kv_store: Arc, logger: L, address_type: crate::config::AddressType, ) -> Result, std::io::Error> where L::Target: LdkLogger, { + let address_type_suffix = match address_type { + crate::config::AddressType::Legacy => "legacy", + crate::config::AddressType::NestedSegwit => "nested_segwit", + crate::config::AddressType::NativeSegwit => "native_segwit", + crate::config::AddressType::Taproot => "taproot", + }; + let secondary_namespace = if $secondary_namespace.is_empty() { + address_type_suffix.to_string() + } else { + format!("{}/{}", $secondary_namespace, address_type_suffix) + }; let bytes = - match KVStoreSync::read(&*kv_store, $primary_namespace, $secondary_namespace, $key) + match KVStoreSync::read(&*kv_store, $primary_namespace, &secondary_namespace, $key) { Ok(bytes) => bytes, Err(e) => { @@ -438,7 +449,7 @@ macro_rules! impl_read_write_change_set_type { logger, "Reading data from key {}/{}/{} failed due to: {}", $primary_namespace, - $secondary_namespace, + secondary_namespace, $key, e ); @@ -464,18 +475,30 @@ macro_rules! 
impl_read_write_change_set_type { pub(crate) fn $write_name( value: &$change_set_type, kv_store: Arc, logger: L, + address_type: crate::config::AddressType, ) -> Result<(), std::io::Error> where L::Target: LdkLogger, { + let address_type_suffix = match address_type { + crate::config::AddressType::Legacy => "legacy", + crate::config::AddressType::NestedSegwit => "nested_segwit", + crate::config::AddressType::NativeSegwit => "native_segwit", + crate::config::AddressType::Taproot => "taproot", + }; + let secondary_namespace = if $secondary_namespace.is_empty() { + address_type_suffix.to_string() + } else { + format!("{}/{}", $secondary_namespace, address_type_suffix) + }; let data = ChangeSetSerWrapper(value).encode(); - KVStoreSync::write(&*kv_store, $primary_namespace, $secondary_namespace, $key, data) + KVStoreSync::write(&*kv_store, $primary_namespace, &secondary_namespace, $key, data) .map_err(|e| { log_error!( logger, "Writing data to key {}/{}/{} failed due to: {}", $primary_namespace, - $secondary_namespace, + secondary_namespace, $key, e ); @@ -541,13 +564,13 @@ impl_read_write_change_set_type!( // Reads the full BdkWalletChangeSet or returns default fields pub(crate) fn read_bdk_wallet_change_set( - kv_store: Arc, logger: Arc, + kv_store: Arc, logger: Arc, address_type: crate::config::AddressType, ) -> Result, std::io::Error> { let mut change_set = BdkWalletChangeSet::default(); // We require a descriptor and return `None` to signal creation of a new wallet otherwise. if let Some(descriptor) = - read_bdk_wallet_descriptor(Arc::clone(&kv_store), Arc::clone(&logger))? + read_bdk_wallet_descriptor(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? { change_set.descriptor = Some(descriptor); } else { @@ -556,7 +579,7 @@ pub(crate) fn read_bdk_wallet_change_set( // We require a change_descriptor and return `None` to signal creation of a new wallet otherwise. 
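The macro changes above derive a per-address-type secondary namespace so that each wallet's change sets are persisted under distinct KVStore keys. A standalone sketch of just the namespace derivation (the KVStore plumbing is elided; `secondary_namespace` is a hypothetical stand-in for the macro-expanded logic):

```rust
// Sketch of the per-address-type namespace derivation used by the
// impl_read_write_change_set_type! changes above. String logic only;
// the actual KVStoreSync read/write calls are elided.
#[derive(Clone, Copy)]
enum AddressType {
    Legacy,
    NestedSegwit,
    NativeSegwit,
    Taproot,
}

fn secondary_namespace(base: &str, address_type: AddressType) -> String {
    let suffix = match address_type {
        AddressType::Legacy => "legacy",
        AddressType::NestedSegwit => "nested_segwit",
        AddressType::NativeSegwit => "native_segwit",
        AddressType::Taproot => "taproot",
    };
    // An empty base namespace becomes just the suffix; otherwise the
    // suffix is nested under the existing namespace with a '/'.
    if base.is_empty() {
        suffix.to_string()
    } else {
        format!("{}/{}", base, suffix)
    }
}

fn main() {
    assert_eq!(secondary_namespace("", AddressType::Taproot), "taproot");
    assert_eq!(
        secondary_namespace("bdk_wallet", AddressType::Legacy),
        "bdk_wallet/legacy"
    );
}
```

Keeping each address type under its own namespace is what lets the wallets persist and sync independently without their change sets overwriting each other.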
if let Some(change_descriptor) = - read_bdk_wallet_change_descriptor(Arc::clone(&kv_store), Arc::clone(&logger))? + read_bdk_wallet_change_descriptor(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? { change_set.change_descriptor = Some(change_descriptor); } else { @@ -564,17 +587,19 @@ pub(crate) fn read_bdk_wallet_change_set( } // We require a network and return `None` to signal creation of a new wallet otherwise. - if let Some(network) = read_bdk_wallet_network(Arc::clone(&kv_store), Arc::clone(&logger))? { + if let Some(network) = + read_bdk_wallet_network(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? + { change_set.network = Some(network); } else { return Ok(None); } - read_bdk_wallet_local_chain(Arc::clone(&kv_store), Arc::clone(&logger))? + read_bdk_wallet_local_chain(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? .map(|local_chain| change_set.local_chain = local_chain); - read_bdk_wallet_tx_graph(Arc::clone(&kv_store), Arc::clone(&logger))? + read_bdk_wallet_tx_graph(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? .map(|tx_graph| change_set.tx_graph = tx_graph); - read_bdk_wallet_indexer(Arc::clone(&kv_store), Arc::clone(&logger))? + read_bdk_wallet_indexer(Arc::clone(&kv_store), Arc::clone(&logger), address_type)? 
.map(|indexer| change_set.indexer = indexer); Ok(Some(change_set)) } diff --git a/src/lib.rs b/src/lib.rs index 4b94448fa..80779f6e5 100644 --- a/src/lib.rs +++ b/src/lib.rs @@ -110,7 +110,7 @@ use std::sync::atomic::AtomicU32; use std::sync::{Arc, Mutex, RwLock}; use std::time::{Duration, Instant, SystemTime, UNIX_EPOCH}; -pub use balance::{BalanceDetails, LightningBalance, PendingSweepBalance}; +pub use balance::{AddressTypeBalance, BalanceDetails, LightningBalance, PendingSweepBalance}; use bitcoin::secp256k1::PublicKey; use bitcoin::{Address, Amount}; #[cfg(feature = "uniffi")] @@ -121,8 +121,9 @@ pub use builder::{BuildError, ChannelDataMigration}; use chain::ChainSource; pub use config::{battery_saving_sync_intervals, RuntimeSyncIntervals}; use config::{ - default_user_config, may_announce_channel, AsyncPaymentsRole, BackgroundSyncConfig, - ChannelConfig, Config, NODE_ANN_BCAST_INTERVAL, PEER_RECONNECTION_INTERVAL, RGS_SYNC_INTERVAL, + default_user_config, may_announce_channel, AddressType, AsyncPaymentsRole, + BackgroundSyncConfig, ChannelConfig, Config, NODE_ANN_BCAST_INTERVAL, + PEER_RECONNECTION_INTERVAL, RGS_SYNC_INTERVAL, }; use connection::ConnectionManager; pub use error::Error as NodeError; @@ -1749,6 +1750,31 @@ impl Node { } } + /// Retrieves the on-chain balance for a specific address type. + /// + /// Returns an error if the address type is not loaded (i.e., not the primary type + /// and not in `address_types_to_monitor`). + /// + /// Note: The returned `spendable_sats` does not account for anchor channel reserves. + /// Use [`list_balances`] to get the aggregate balance with reserves applied. 
+	///
+	/// [`list_balances`]: Self::list_balances
+	pub fn get_balance_for_address_type(
+		&self, address_type: AddressType,
+	) -> Result<AddressTypeBalance, Error> {
+		let (total_sats, spendable_sats) =
+			self.wallet.get_balance_for_address_type(address_type)?;
+		Ok(AddressTypeBalance { total_sats, spendable_sats })
+	}
+
+	/// Returns all address types currently loaded and monitored by the wallet.
+	///
+	/// This includes the primary address type and all types specified in
+	/// `address_types_to_monitor` during configuration.
+	pub fn list_monitored_address_types(&self) -> Vec<AddressType> {
+		self.wallet.get_loaded_address_types()
+	}
+
 	/// Retrieves all payments that match the given predicate.
 	///
 	/// For example, you could retrieve all stored outbound payments as follows:
@@ -1924,6 +1950,7 @@ pub(crate) struct NodeMetrics {
 	last_known_spendable_onchain_balance_sats: Option<u64>,
 	last_known_total_onchain_balance_sats: Option<u64>,
 	last_known_total_lightning_balance_sats: Option<u64>,
+	monitored_wallet_sync_timestamps: Vec<(AddressType, u64)>,
 }
 
 impl Default for NodeMetrics {
@@ -1939,8 +1966,32 @@ impl Default for NodeMetrics {
 			last_known_spendable_onchain_balance_sats: None,
 			last_known_total_onchain_balance_sats: None,
 			last_known_total_lightning_balance_sats: None,
+			monitored_wallet_sync_timestamps: Vec::new(),
 		}
 	}
 }
+
+impl NodeMetrics {
+	pub(crate) fn get_wallet_sync_timestamp(&self, address_type: AddressType) -> Option<u64> {
+		self.monitored_wallet_sync_timestamps
+			.iter()
+			.find(|(at, _)| *at == address_type)
+			.map(|(_, ts)| *ts)
+	}
+
+	pub(crate) fn set_wallet_sync_timestamp(&mut self, address_type: AddressType, timestamp: u64) {
+		if let Some(entry) =
+			self.monitored_wallet_sync_timestamps.iter_mut().find(|(at, _)| *at == address_type)
+		{
+			entry.1 = timestamp;
+		} else {
+			self.monitored_wallet_sync_timestamps.push((address_type, timestamp));
+		}
+	}
+
+	pub(crate) fn retain_wallet_sync_timestamps(&mut self, address_types: &[AddressType]) {
+		self.monitored_wallet_sync_timestamps.retain(|(at, _)|
address_types.contains(at));
+	}
+}
 
 impl_writeable_tlv_based!(NodeMetrics, {
@@ -1954,6 +2005,7 @@
 	(12, last_known_spendable_onchain_balance_sats, option),
 	(14, last_known_total_onchain_balance_sats, option),
 	(16, last_known_total_lightning_balance_sats, option),
+	(18, monitored_wallet_sync_timestamps, optional_vec),
 });
 
 // Check if balances have changed and emit BalanceChanged event if so.
diff --git a/src/payment/onchain.rs b/src/payment/onchain.rs
index 1b4bd18dc..3bbeb13c6 100644
--- a/src/payment/onchain.rs
+++ b/src/payment/onchain.rs
@@ -11,7 +11,7 @@ use std::sync::{Arc, RwLock};
 
 use bitcoin::{Address, Txid};
 
-use crate::config::Config;
+use crate::config::{AddressType, Config};
 use crate::error::Error;
 use crate::fee_estimator::ConfirmationTarget;
 use crate::logger::{log_info, LdkLogger, Logger};
@@ -64,6 +64,25 @@ impl OnchainPayment {
 		Ok(funding_address)
 	}
 
+	/// Retrieve a new on-chain address for a specific address type.
+	///
+	/// The address type must be configured as the primary address type or included in
+	/// `address_types_to_monitor` when building the node.
+	pub fn new_address_for_type(&self, address_type: AddressType) -> Result<Address, Error> {
+		if !*self.is_running.read().unwrap() {
+			return Err(Error::NotRunning);
+		}
+
+		let funding_address = self.wallet.get_new_address_for_type(address_type)?;
+		log_info!(
+			self.logger,
+			"Generated new funding address for {:?}: {}",
+			address_type,
+			funding_address
+		);
+		Ok(funding_address)
+	}
+
 	/// Returns a list of all UTXOs that are safe to spend.
 	///
 	/// This excludes any outputs that are currently being used to fund Lightning channels.
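The `NodeMetrics` helpers above keep per-wallet sync timestamps in a `Vec` of pairs rather than a map, which keeps the TLV `optional_vec` serialization straightforward and the entry count tiny. A standalone sketch of that upsert/retain bookkeeping (`SyncTimestamps` is a hypothetical reduction of the real struct):

```rust
// Sketch of the Vec<(AddressType, u64)> timestamp bookkeeping used by the
// NodeMetrics helpers above; AddressType is reduced to two variants here.
#[derive(Clone, Copy, PartialEq, Debug)]
enum AddressType {
    NativeSegwit,
    Taproot,
}

#[derive(Default)]
struct SyncTimestamps(Vec<(AddressType, u64)>);

impl SyncTimestamps {
    fn get(&self, at: AddressType) -> Option<u64> {
        self.0.iter().find(|(a, _)| *a == at).map(|(_, ts)| *ts)
    }

    // Update in place if present, otherwise append: no duplicate entries.
    fn set(&mut self, at: AddressType, ts: u64) {
        if let Some(entry) = self.0.iter_mut().find(|(a, _)| *a == at) {
            entry.1 = ts;
        } else {
            self.0.push((at, ts));
        }
    }

    // Drop timestamps for wallets no longer configured for monitoring.
    fn retain_types(&mut self, keep: &[AddressType]) {
        self.0.retain(|(a, _)| keep.contains(a));
    }
}

fn main() {
    let mut ts = SyncTimestamps::default();
    ts.set(AddressType::NativeSegwit, 100);
    ts.set(AddressType::Taproot, 200);
    ts.set(AddressType::Taproot, 300); // overwrite, not duplicate
    assert_eq!(ts.get(AddressType::Taproot), Some(300));
    assert_eq!(ts.0.len(), 2);

    ts.retain_types(&[AddressType::NativeSegwit]);
    assert_eq!(ts.get(AddressType::Taproot), None);
    assert_eq!(ts.get(AddressType::NativeSegwit), Some(100));
}
```

The retain step matters on restart: if the node is rebuilt with fewer monitored address types, stale timestamps would otherwise accumulate in the persisted metrics forever.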
@@ -133,7 +152,7 @@ impl OnchainPayment { fee_rate, algorithm, &drain_script, - &self.channel_manager, + Some(&self.channel_manager), )?; // Convert selected outpoints back to SpendableUtxo by direct filtering diff --git a/src/wallet/mod.rs b/src/wallet/mod.rs index de8701b72..2a44fb8fc 100644 --- a/src/wallet/mod.rs +++ b/src/wallet/mod.rs @@ -5,11 +5,12 @@ // http://opensource.org/licenses/MIT>, at your option. You may not use this file except in // accordance with one or both of these licenses. +use std::collections::HashMap; use std::future::Future; use std::ops::Deref; use std::pin::Pin; use std::str::FromStr; -use std::sync::{Arc, Mutex, MutexGuard}; +use std::sync::{Arc, Mutex}; use bdk_chain::spk_client::{FullScanRequest, SyncRequest}; pub use bdk_wallet::coin_selection::CoinSelectionAlgorithm as BdkCoinSelectionAlgorithm; @@ -17,7 +18,6 @@ use bdk_wallet::coin_selection::{ BranchAndBoundCoinSelection, Excess, LargestFirstCoinSelection, OldestFirstCoinSelection, SingleRandomDraw, }; -use bdk_wallet::descriptor::ExtendedDescriptor; use bdk_wallet::event::WalletEvent; #[allow(deprecated)] use bdk_wallet::SignOptions; @@ -54,7 +54,7 @@ use lightning::util::message_signing; use lightning_invoice::RawBolt11Invoice; use persist::KVStoreWalletPersister; -use crate::config::Config; +use crate::config::{AddressType, Config}; use crate::event::{TxInput, TxOutput}; use crate::fee_estimator::{ConfirmationTarget, FeeEstimator, OnchainFeeEstimator}; use crate::logger::{log_debug, log_error, log_info, log_trace, LdkLogger, Logger}; @@ -89,27 +89,402 @@ pub enum CoinSelectionAlgorithm { pub(crate) mod persist; pub(crate) mod ser; +/// UTXO info for multi-wallet PSBT building. +#[derive(Clone)] +pub(crate) struct UtxoPsbtInfo { + pub outpoint: OutPoint, + pub psbt_input: psbt::Input, + pub weight: Weight, + pub is_primary: bool, +} + pub(crate) struct Wallet { - // A BDK on-chain wallet. 
-	inner: Mutex<PersistedWallet<KVStoreWalletPersister>>,
-	persister: Mutex<KVStoreWalletPersister>,
+	// Multiple BDK on-chain wallets, one per address type being monitored.
+	// The primary wallet (for generating new addresses) is stored under config.address_type.
+	wallets: Mutex<HashMap<AddressType, bdk_wallet::PersistedWallet<KVStoreWalletPersister>>>,
+	persisters: Mutex<HashMap<AddressType, KVStoreWalletPersister>>,
 	broadcaster: Arc<Broadcaster>,
 	fee_estimator: Arc<OnchainFeeEstimator>,
 	payment_store: Arc<PaymentStore>,
 	config: Arc<Config>,
 	logger: Arc<Logger>,
+	// Optional chain source and runtime for fetching transactions when not in wallet history
+	chain_source: Option<Arc<ChainSource>>,
+	runtime: Option<Arc<tokio::runtime::Runtime>>,
 }
 
 impl Wallet {
 	pub(crate) fn new(
-		wallet: bdk_wallet::PersistedWallet<KVStoreWalletPersister>,
-		wallet_persister: KVStoreWalletPersister, broadcaster: Arc<Broadcaster>,
-		fee_estimator: Arc<OnchainFeeEstimator>, payment_store: Arc<PaymentStore>,
-		config: Arc<Config>, logger: Arc<Logger>,
+		primary_wallet: bdk_wallet::PersistedWallet<KVStoreWalletPersister>,
+		primary_persister: KVStoreWalletPersister,
+		additional_wallets: Vec<(
+			AddressType,
+			bdk_wallet::PersistedWallet<KVStoreWalletPersister>,
+			KVStoreWalletPersister,
+		)>,
+		broadcaster: Arc<Broadcaster>, fee_estimator: Arc<OnchainFeeEstimator>,
+		payment_store: Arc<PaymentStore>, config: Arc<Config>, logger: Arc<Logger>,
+		chain_source: Option<Arc<ChainSource>>,
+		runtime: Option<Arc<tokio::runtime::Runtime>>,
 	) -> Self {
-		let inner = Mutex::new(wallet);
-		let persister = Mutex::new(wallet_persister);
-		Self { inner, persister, broadcaster, fee_estimator, payment_store, config, logger }
+		let mut wallets = HashMap::new();
+		let mut persisters = HashMap::new();
+
+		// Add primary wallet
+		wallets.insert(config.address_type, primary_wallet);
+		persisters.insert(config.address_type, primary_persister);
+
+		// Add additional wallets for monitoring
+		for (address_type, wallet, persister) in additional_wallets {
+			wallets.insert(address_type, wallet);
+			persisters.insert(address_type, persister);
+		}
+
+		Self {
+			wallets: Mutex::new(wallets),
+			persisters: Mutex::new(persisters),
+			broadcaster,
+			fee_estimator,
+			payment_store,
+			config,
+			logger,
+			chain_source,
+			runtime,
+		}
+	}
+
+	fn with_wallet_mut<T, F>(&self, address_type: AddressType, f: F) -> Result<T, Error>
+	where
+		F: FnOnce(
+			&mut PersistedWallet<KVStoreWalletPersister>,
+			&mut KVStoreWalletPersister,
+		) -> Result<T, Error>,
+	{
+		let mut wallets =
self.wallets.lock().unwrap();
+		let mut persisters = self.persisters.lock().unwrap();
+
+		let wallet = wallets.get_mut(&address_type).ok_or_else(|| {
+			log_error!(self.logger, "Wallet not found for address type {:?}", address_type);
+			Error::WalletOperationFailed
+		})?;
+
+		let persister = persisters.get_mut(&address_type).ok_or_else(|| {
+			log_error!(self.logger, "Persister not found for address type {:?}", address_type);
+			Error::WalletOperationFailed
+		})?;
+
+		f(wallet, persister)
+	}
+
+	fn with_primary_wallet_mut<T, F>(&self, f: F) -> Result<T, Error>
+	where
+		F: FnOnce(
+			&mut PersistedWallet<KVStoreWalletPersister>,
+			&mut KVStoreWalletPersister,
+		) -> Result<T, Error>,
+	{
+		self.with_wallet_mut(self.config.address_type, f)
+	}
+
+	fn collect_from_wallets<T, F>(&self, f: F) -> Vec<T>
+	where
+		F: Fn(&PersistedWallet<KVStoreWalletPersister>) -> Vec<T>,
+	{
+		let wallets = self.wallets.lock().unwrap();
+		wallets.values().flat_map(|w| f(w)).collect()
+	}
+
+	fn calculate_utxo_weight(script_pubkey: &ScriptBuf) -> Weight {
+		match script_pubkey.witness_version() {
+			Some(bitcoin::WitnessVersion::V0) => Weight::from_wu(272), // P2WPKH
+			Some(bitcoin::WitnessVersion::V1) => Weight::from_wu(230), // P2TR
+			None => {
+				// Check if P2SH-wrapped SegWit (nested segwit):
+				// OP_HASH160 <20-byte hash> OP_EQUAL, 23 bytes, OP_EQUAL last.
+				let script_bytes = script_pubkey.as_bytes();
+				if script_bytes.len() == 23 && script_bytes[0] == 0xa9 && script_bytes[22] == 0x87 {
+					Weight::from_wu(360) // P2SH-wrapped P2WPKH
+				} else {
+					Weight::from_wu(588) // P2PKH (legacy)
+				}
+			},
+			_ => Weight::from_wu(272), // Fallback to P2WPKH weight
+		}
+	}
+
+	fn find_wallet_for_tx(
+		wallets: &HashMap<AddressType, PersistedWallet<KVStoreWalletPersister>>, txid: Txid,
+	) -> Option<AddressType> {
+		wallets.iter().find_map(|(addr_type, wallet)| wallet.get_tx(txid).map(|_| *addr_type))
+	}
+
+	fn find_tx_in_wallets(
+		wallets: &HashMap<AddressType, PersistedWallet<KVStoreWalletPersister>>, txid: Txid,
+	) -> Option<Transaction> {
+		for wallet in wallets.values() {
+			if let Some(tx_node) = wallet.get_tx(txid) {
+				return Some((*tx_node.tx_node.tx).clone());
+			}
+		}
+		None
+	}
+
+	fn get_aggregate_balance(&self) -> Balance {
+		let wallets = self.wallets.lock().unwrap();
+ Self::get_aggregate_balance_from_wallets(&wallets) + } + + fn get_aggregate_balance_from_wallets( + wallets: &HashMap>, + ) -> Balance { + let mut total = Balance::default(); + for wallet in wallets.values() { + let balance = wallet.balance(); + total.confirmed += balance.confirmed; + total.trusted_pending += balance.trusted_pending; + total.untrusted_pending += balance.untrusted_pending; + } + total + } + + fn calculate_fee_from_psbt( + &self, psbt: &Psbt, wallets: &HashMap>, + ) -> Result { + let mut total_input_value = 0u64; + + for (i, txin) in psbt.unsigned_tx.input.iter().enumerate() { + if let Some(psbt_input) = psbt.inputs.get(i) { + if let Some(witness_utxo) = &psbt_input.witness_utxo { + total_input_value += witness_utxo.value.to_sat(); + } else if let Some(non_witness_tx) = &psbt_input.non_witness_utxo { + if let Some(txout) = + non_witness_tx.output.get(txin.previous_output.vout as usize) + { + total_input_value += txout.value.to_sat(); + } else { + log_error!( + self.logger, + "Could not find output {} in non_witness_utxo for input {:?}", + txin.previous_output.vout, + txin.previous_output + ); + return Err(Error::OnchainTxCreationFailed); + } + } else { + // Fallback: try to find the UTXO in any wallet + let mut found = false; + for wallet in wallets.values() { + if let Some(local_utxo) = wallet.get_utxo(txin.previous_output) { + total_input_value += local_utxo.txout.value.to_sat(); + found = true; + break; + } + } + if !found { + log_error!( + self.logger, + "Could not find TxOut for input {:?} in PSBT or any wallet", + txin.previous_output + ); + return Err(Error::OnchainTxCreationFailed); + } + } + } else { + // PSBT input not found, try to find in wallets + let mut found = false; + for wallet in wallets.values() { + if let Some(local_utxo) = wallet.get_utxo(txin.previous_output) { + total_input_value += local_utxo.txout.value.to_sat(); + found = true; + break; + } + } + if !found { + log_error!( + self.logger, + "Could not find TxOut for input 
{:?} in PSBT or any wallet", + txin.previous_output + ); + return Err(Error::OnchainTxCreationFailed); + } + } + } + + let total_output_value: u64 = + psbt.unsigned_tx.output.iter().map(|txout| txout.value.to_sat()).sum(); + + Ok(total_input_value.saturating_sub(total_output_value)) + } + + fn prepare_utxos_for_psbt( + &self, utxos: &[LocalOutput], + wallets: &HashMap>, + ) -> Result, Error> { + let mut result = Vec::new(); + + for utxo in utxos { + let mut found_wallet: Option<(AddressType, &PersistedWallet)> = + None; + for (addr_type, w) in wallets.iter() { + if w.get_utxo(utxo.outpoint).is_some() { + found_wallet = Some((*addr_type, w)); + break; + } + } + + let (addr_type, wallet) = match found_wallet { + Some(w) => w, + None => { + log_error!(self.logger, "UTXO {:?} not found in any wallet", utxo.outpoint); + return Err(Error::OnchainTxCreationFailed); + }, + }; + + let local_utxo = match wallet.get_utxo(utxo.outpoint) { + Some(u) => u, + None => { + log_error!(self.logger, "UTXO {:?} disappeared from wallet", utxo.outpoint); + return Err(Error::OnchainTxCreationFailed); + }, + }; + + let mut psbt_input = match wallet.get_psbt_input(local_utxo.clone(), None, true) { + Ok(input) => input, + Err(e) => { + log_error!( + self.logger, + "Failed to get PSBT input for UTXO {:?}: {}", + utxo.outpoint, + e + ); + return Err(Error::OnchainTxCreationFailed); + }, + }; + + let is_primary = addr_type == self.config.address_type; + let weight = Self::calculate_utxo_weight(&local_utxo.txout.script_pubkey); + + if !is_primary { + // BDK requires non_witness_utxo for foreign UTXOs + if psbt_input.non_witness_utxo.is_none() { + let found_tx = Self::find_tx_in_wallets(wallets, utxo.outpoint.txid); + + if let Some(tx) = found_tx { + psbt_input.non_witness_utxo = Some(tx); + log_debug!( + self.logger, + "Set non_witness_utxo for foreign UTXO {:?} from wallet", + utxo.outpoint + ); + } else { + log_info!( + self.logger, + "Transaction {:?} not found in any wallet, attempting chain 
source...", + utxo.outpoint.txid + ); + if let (Some(chain_source), Some(runtime)) = + (&self.chain_source, &self.runtime) + { + match runtime + .block_on(chain_source.get_transaction(&utxo.outpoint.txid)) + { + Ok(Some(tx)) => { + psbt_input.non_witness_utxo = Some(tx.clone()); + log_info!( + self.logger, + "Fetched transaction {:?} from chain source", + utxo.outpoint.txid + ); + }, + Ok(None) => { + log_error!( + self.logger, + "Transaction {:?} not found in chain source", + utxo.outpoint.txid + ); + return Err(Error::OnchainTxCreationFailed); + }, + Err(e) => { + log_error!( + self.logger, + "Failed to fetch transaction {:?}: {}", + utxo.outpoint.txid, + e + ); + return Err(Error::OnchainTxCreationFailed); + }, + } + } else { + log_error!( + self.logger, + "Cannot get transaction for foreign UTXO {:?}: no chain source", + utxo.outpoint + ); + return Err(Error::OnchainTxCreationFailed); + } + } + } + if psbt_input.witness_utxo.is_none() { + psbt_input.witness_utxo = Some(local_utxo.txout.clone()); + } + } else if psbt_input.witness_utxo.is_none() { + psbt_input.witness_utxo = Some(local_utxo.txout.clone()); + } + + result.push(UtxoPsbtInfo { outpoint: utxo.outpoint, psbt_input, weight, is_primary }); + } + + Ok(result) + } + + fn prepare_outpoints_for_psbt( + &self, outpoints: &[OutPoint], + wallets: &HashMap>, + ) -> Result, Error> { + let mut utxos = Vec::new(); + for outpoint in outpoints { + let mut found = false; + for wallet in wallets.values() { + if let Some(local_utxo) = wallet.get_utxo(*outpoint) { + utxos.push(local_utxo); + found = true; + break; + } + } + if !found { + log_error!(self.logger, "UTXO {:?} not found in any wallet", outpoint); + return Err(Error::OnchainTxCreationFailed); + } + } + self.prepare_utxos_for_psbt(&utxos, wallets) + } + + fn add_utxos_to_tx_builder( + &self, tx_builder: &mut bdk_wallet::TxBuilder<'_, Cs>, utxo_infos: &[UtxoPsbtInfo], + ) -> Result<(), Error> + where + Cs: bdk_wallet::coin_selection::CoinSelectionAlgorithm, + 
{ + for info in utxo_infos { + if info.is_primary { + tx_builder.add_utxo(info.outpoint).map_err(|e| { + log_error!(self.logger, "Failed to add UTXO {:?}: {}", info.outpoint, e); + Error::OnchainTxCreationFailed + })?; + } else { + tx_builder + .add_foreign_utxo(info.outpoint, info.psbt_input.clone(), info.weight) + .map_err(|e| { + log_error!( + self.logger, + "Failed to add foreign UTXO {:?}: {}", + info.outpoint, + e + ); + Error::OnchainTxCreationFailed + })?; + } + } + Ok(()) } pub(crate) fn is_funding_transaction( @@ -137,92 +512,226 @@ impl Wallet { } pub(crate) fn get_full_scan_request(&self) -> FullScanRequest { - self.inner.lock().unwrap().start_full_scan().build() + let wallets = self.wallets.lock().unwrap(); + wallets.get(&self.config.address_type).unwrap().start_full_scan().build() } pub(crate) fn get_incremental_sync_request(&self) -> SyncRequest<(KeychainKind, u32)> { - self.inner.lock().unwrap().start_sync_with_revealed_spks().build() + let wallets = self.wallets.lock().unwrap(); + wallets.get(&self.config.address_type).unwrap().start_sync_with_revealed_spks().build() + } + + pub(crate) fn apply_update_to_wallet( + &self, address_type: AddressType, update: impl Into, + ) -> Result, Error> { + let update = update.into(); + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + + let events_result = if let (Some(wallet), Some(persister)) = + (wallets.get_mut(&address_type), persisters.get_mut(&address_type)) + { + match wallet.apply_update_events(update) { + Ok(events) => { + wallet.persist(persister).map_err(|e| { + log_error!( + self.logger, + "Failed to persist wallet for {:?}: {}", + address_type, + e + ); + Error::PersistenceFailed + })?; + + let txids: Vec = + wallet.transactions().map(|wtx| wtx.tx_node.txid).collect(); + Ok((events, txids)) + }, + Err(e) => { + log_error!( + self.logger, + "Failed to apply update to wallet {:?}: {}", + address_type, + e + ); + Err(Error::WalletOperationFailed) 
+				},
+			}
+		} else {
+			log_error!(
+				self.logger,
+				"Wallet or persister not found for address type {:?}",
+				address_type
+			);
+			Err(Error::WalletOperationFailed)
+		};
+
+		drop(wallets);
+		drop(persisters);
+
+		match events_result {
+			Ok((events, txids)) => {
+				if let Err(e) = self.update_payment_store_for_txids(txids) {
+					log_error!(
+						self.logger,
+						"Failed to update payment store for wallet {:?}: {}",
+						address_type,
+						e
+					);
+				}
+				Ok(events)
+			},
+			Err(e) => Err(e),
+		}
+	}
+
+	pub(crate) fn get_wallet_sync_request(
+		&self, address_type: AddressType,
+	) -> Result<(FullScanRequest<KeychainKind>, SyncRequest<(KeychainKind, u32)>), Error> {
+		let wallets = self.wallets.lock().unwrap();
+		let wallet = wallets.get(&address_type).ok_or_else(|| {
+			let loaded_types: Vec<_> = wallets.keys().copied().collect();
+			log_error!(
+				self.logger,
+				"Wallet not found for address type {:?}. Expected in address_types_to_monitor: {:?}, Primary: {:?}, Loaded wallets: {:?}",
+				address_type,
+				self.config.address_types_to_monitor,
+				self.config.address_type,
+				loaded_types
+			);
+			Error::WalletOperationFailed
+		})?;
+
+		let full_scan = wallet.start_full_scan().build();
+		let incremental_sync = wallet.start_sync_with_revealed_spks().build();
+		Ok((full_scan, incremental_sync))
 	}

 	pub(crate) fn get_cached_txs(&self) -> Vec<Arc<Transaction>> {
-		self.inner.lock().unwrap().tx_graph().full_txs().map(|tx_node| tx_node.tx).collect()
+		self.collect_from_wallets(|wallet| {
+			wallet.tx_graph().full_txs().map(|tx_node| tx_node.tx).collect()
+		})
 	}

 	pub(crate) fn get_unconfirmed_txids(&self) -> Vec<Txid> {
-		self.inner
-			.lock()
-			.unwrap()
-			.transactions()
-			.filter(|t| t.chain_position.is_unconfirmed())
-			.map(|t| t.tx_node.txid)
-			.collect()
+		self.collect_from_wallets(|wallet| {
+			wallet
+				.transactions()
+				.filter(|t| t.chain_position.is_unconfirmed())
+				.map(|t| t.tx_node.txid)
+				.collect()
+		})
 	}

 	pub(crate) fn is_tx_confirmed(&self, txid: &Txid) -> bool {
-		self.inner
-			.lock()
-			.unwrap()
-			.get_tx(*txid)
-			.map(|tx_node|
tx_node.chain_position.is_confirmed()) - .unwrap_or(false) + // Check all wallets + let wallets = self.wallets.lock().unwrap(); + for wallet in wallets.values() { + if let Some(tx_node) = wallet.get_tx(*txid) { + if tx_node.chain_position.is_confirmed() { + return true; + } + } + } + false } pub(crate) fn current_best_block(&self) -> BestBlock { - let checkpoint = self.inner.lock().unwrap().latest_checkpoint(); + // Use primary wallet's checkpoint + let wallets = self.wallets.lock().unwrap(); + let checkpoint = wallets.get(&self.config.address_type).unwrap().latest_checkpoint(); BestBlock { block_hash: checkpoint.hash(), height: checkpoint.height() } } // Get a drain script for change outputs. pub(crate) fn get_drain_script(&self) -> Result { - let locked_wallet = self.inner.lock().unwrap(); - let change_address = locked_wallet.peek_address(KeychainKind::Internal, 0); + // Use primary wallet for change addresses + let wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get(&self.config.address_type).unwrap(); + let change_address = wallet.peek_address(KeychainKind::Internal, 0); Ok(change_address.address.script_pubkey()) } pub(crate) fn apply_update( &self, update: impl Into, ) -> Result, Error> { - let mut locked_wallet = self.inner.lock().unwrap(); - match locked_wallet.apply_update_events(update) { - Ok(events) => { - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + let update = update.into(); + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + let mut all_events = Vec::new(); + + let (primary_events_result, all_txids) = + if let Some(wallet) = wallets.get_mut(&self.config.address_type) { + let events_result = match wallet.apply_update_events(update) { + Ok(events) => { + if let Some(persister) = 
persisters.get_mut(&self.config.address_type) { + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet: {}", e); + Error::PersistenceFailed + })?; + } + Ok(events) + }, + Err(e) => { + log_error!(self.logger, "Sync failed due to chain connection error: {}", e); + Err(Error::WalletOperationFailed) + }, + }; - self.update_payment_store(&mut *locked_wallet).map_err(|e| { - log_error!(self.logger, "Failed to update payment store: {}", e); - Error::PersistenceFailed - })?; + let mut all_txids_set = std::collections::HashSet::new(); + for wallet in wallets.values() { + for wtx in wallet.transactions() { + all_txids_set.insert(wtx.tx_node.txid); + } + } + (events_result, all_txids_set.into_iter().collect()) + } else { + (Err(Error::WalletOperationFailed), Vec::new()) + }; - Ok(events) - }, - Err(e) => { - log_error!(self.logger, "Sync failed due to chain connection error: {}", e); - Err(Error::WalletOperationFailed) + drop(wallets); + drop(persisters); + + match primary_events_result { + Ok(events) => { + all_events.extend(events); + if !all_txids.is_empty() { + self.update_payment_store_for_txids(all_txids).map_err(|e| { + log_error!(self.logger, "Failed to update payment store: {}", e); + Error::PersistenceFailed + })?; + } }, + Err(e) => return Err(e), } + + Ok(all_events) } pub(crate) fn apply_mempool_txs( &self, unconfirmed_txs: Vec<(Transaction, u64)>, evicted_txids: Vec<(Txid, u64)>, ) -> Result<(), Error> { - let mut locked_wallet = self.inner.lock().unwrap(); - locked_wallet.apply_unconfirmed_txs(unconfirmed_txs); - locked_wallet.apply_evicted_txs(evicted_txids); + // Apply mempool updates to all wallets + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + 
for (address_type, wallet) in wallets.iter_mut() {
+			wallet.apply_unconfirmed_txs(unconfirmed_txs.clone());
+			wallet.apply_evicted_txs(evicted_txids.clone());
+
+			if let Some(persister) = persisters.get_mut(address_type) {
+				wallet.persist(persister).map_err(|e| {
+					log_error!(self.logger, "Failed to persist wallet {:?}: {}", address_type, e);
+					Error::PersistenceFailed
+				})?;
+			}
+		}
 		Ok(())
 	}

 	// Bumps the fee of an existing transaction using Replace-By-Fee (RBF).
 	// Returns the txid of the new transaction if successful.
+	// Supports both single-wallet and cross-wallet transactions.
 	pub(crate) fn bump_fee_by_rbf(
 		&self, txid: &Txid, fee_rate: FeeRate, channel_manager: &ChannelManager,
 	) -> Result<Txid, Error> {
@@ -235,29 +744,65 @@ impl Wallet {
 			);
 			return Err(Error::CannotRbfFundingTransaction);
 		}
-		let mut locked_wallet = self.inner.lock().unwrap();
-		// Find the transaction in the wallet
-		let tx_node = locked_wallet.get_tx(*txid).ok_or_else(|| {
-			log_error!(self.logger, "Transaction not found in wallet: {}", txid);
+		// Find which wallet contains this transaction
+		let mut wallets = self.wallets.lock().unwrap();
+		let mut persisters = self.persisters.lock().unwrap();
+
+		// First, find the transaction to get its inputs
+		let tx = Self::find_tx_in_wallets(&wallets, *txid).ok_or_else(|| {
+			log_error!(self.logger, "Transaction not found in any wallet: {}", txid);
 			Error::TransactionNotFound
 		})?;

-		// Check if transaction is confirmed - can't replace confirmed transactions
-		if tx_node.chain_position.is_confirmed() {
-			log_error!(self.logger, "Cannot replace confirmed transaction: {}", txid);
-			return Err(Error::TransactionAlreadyConfirmed);
+		// Check if this is a cross-wallet transaction by seeing which wallets own inputs
+		let mut wallets_with_inputs: Vec<AddressType> = Vec::new();
+		for (addr_type, wallet) in wallets.iter() {
+			for input in &tx.input {
+				if let Some(prev_tx_node) = wallet.get_tx(input.previous_output.txid) {
+					// This wallet has the previous tx, check if the output
belongs to this wallet + if let Some(prev_output) = + prev_tx_node.tx_node.tx.output.get(input.previous_output.vout as usize) + { + if wallet.is_mine(prev_output.script_pubkey.clone()) { + if !wallets_with_inputs.contains(addr_type) { + wallets_with_inputs.push(*addr_type); + } + break; + } + } + } + } } - // Calculate original transaction fee and fee rate - let original_tx = &tx_node.tx_node.tx; - let original_fee = locked_wallet.calculate_fee(original_tx).map_err(|e| { - log_error!(self.logger, "Failed to calculate original fee: {}", e); - Error::WalletOperationFailed + let is_cross_wallet = wallets_with_inputs.len() > 1; + + // Get transaction details from any wallet that has it + let first_wallet_type = Self::find_wallet_for_tx(&wallets, *txid).ok_or_else(|| { + log_error!(self.logger, "Transaction not found in any wallet: {}", txid); + Error::TransactionNotFound })?; - // Use Bitcoin crate's built-in fee rate calculation for accuracy - let original_fee_rate = original_fee / original_tx.weight(); + let (original_tx, original_fee, original_fee_rate, is_confirmed) = { + let wallet = wallets.get(&first_wallet_type).unwrap(); + let tx_node = wallet.get_tx(*txid).unwrap(); + + let is_confirmed = tx_node.chain_position.is_confirmed(); + let original_tx = (*tx_node.tx_node.tx).clone(); + let original_fee = wallet.calculate_fee(&original_tx).map_err(|e| { + log_error!(self.logger, "Failed to calculate original fee: {}", e); + Error::WalletOperationFailed + })?; + let original_fee_rate = original_fee / original_tx.weight(); + + (original_tx, original_fee, original_fee_rate, is_confirmed) + }; + + // Check if transaction is confirmed - can't replace confirmed transactions + if is_confirmed { + log_error!(self.logger, "Cannot replace confirmed transaction: {}", txid); + return Err(Error::TransactionAlreadyConfirmed); + } // Log detailed information for debugging log_info!(self.logger, "RBF Analysis for transaction {}", txid); @@ -280,9 +825,15 @@ impl Wallet { 
fee_rate.to_sat_per_kwu(), fee_rate.to_sat_per_vb_ceil() ); + if is_cross_wallet { + log_info!( + self.logger, + " Cross-wallet transaction: inputs from {:?}", + wallets_with_inputs + ); + } // Essential validation: new fee rate must be higher than original - // This prevents definite rejections by the Bitcoin network if fee_rate <= original_fee_rate { log_error!( self.logger, @@ -300,114 +851,449 @@ impl Wallet { fee_rate.to_sat_per_vb_ceil() ); - // Build a new transaction with higher fee using BDK's fee bump functionality - let mut tx_builder = locked_wallet.build_fee_bump(*txid).map_err(|e| { - log_error!(self.logger, "Failed to create fee bump builder: {}", e); - Error::OnchainTxCreationFailed - })?; + let replacement_tx = if is_cross_wallet { + // For cross-wallet transactions, we need to build the RBF manually + // because BDK's build_fee_bump only operates on a single wallet + self.build_cross_wallet_rbf( + &wallets, + &mut persisters, + &original_tx, + fee_rate, + original_fee, + )? 
+ } else { + // For single-wallet transactions, use BDK's build_fee_bump + let address_type = first_wallet_type; + let wallet = wallets.get_mut(&address_type).unwrap(); + let persister = persisters.get_mut(&address_type).unwrap(); + + let mut tx_builder = wallet.build_fee_bump(*txid).map_err(|e| { + log_error!(self.logger, "Failed to create fee bump builder: {}", e); + Error::OnchainTxCreationFailed + })?; - // Set the new fee rate - tx_builder.fee_rate(fee_rate); + tx_builder.fee_rate(fee_rate); - // Finalize the transaction - let mut psbt = match tx_builder.finish() { - Ok(psbt) => { - log_trace!(self.logger, "Created RBF PSBT: {:?}", psbt); - psbt - }, - Err(err) => { - log_error!(self.logger, "Failed to create RBF transaction: {}", err); - return Err(Error::OnchainTxCreationFailed); - }, - }; + let mut psbt = match tx_builder.finish() { + Ok(psbt) => { + log_trace!(self.logger, "Created RBF PSBT: {:?}", psbt); + psbt + }, + Err(err) => { + log_error!(self.logger, "Failed to create RBF transaction: {}", err); + return Err(Error::OnchainTxCreationFailed); + }, + }; - // Sign the transaction - match locked_wallet.sign(&mut psbt, SignOptions::default()) { - Ok(finalized) => { - if !finalized { - log_error!(self.logger, "Failed to finalize RBF transaction"); + match wallet.sign(&mut psbt, SignOptions::default()) { + Ok(finalized) => { + if !finalized { + log_error!(self.logger, "Failed to finalize RBF transaction"); + return Err(Error::OnchainTxSigningFailed); + } + }, + Err(err) => { + log_error!(self.logger, "Failed to sign RBF transaction: {}", err); return Err(Error::OnchainTxSigningFailed); + }, + } + + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet: {}", e); + Error::PersistenceFailed + })?; + + psbt.extract_tx().map_err(|e| { + log_error!(self.logger, "Failed to extract transaction: {}", e); + Error::OnchainTxCreationFailed + })? 
+		};
+
+		self.broadcaster.broadcast_transactions(&[&replacement_tx]);
+
+		let new_txid = replacement_tx.compute_txid();
+
+		// Estimate the replacement fee from the requested fee rate. The exact fee
+		// depends on the final input set, so the rate logged below is approximate.
+		let new_weight = replacement_tx.weight();
+		let new_fee_sats = fee_rate.to_sat_per_kwu() as u64 * new_weight.to_wu() / 1000;
+		let approx_fee_rate = FeeRate::from_sat_per_kwu(new_fee_sats * 1000 / new_weight.to_wu());
+
+		log_info!(self.logger, "RBF transaction created successfully!");
+		log_info!(
+			self.logger,
+			" Original: {} ({} sat/vB, {} sats fee)",
+			txid,
+			original_fee_rate.to_sat_per_vb_ceil(),
+			original_fee.to_sat()
+		);
+		log_info!(
+			self.logger,
+			" Replacement: {} (~{} sat/vB)",
+			new_txid,
+			approx_fee_rate.to_sat_per_vb_ceil()
+		);
+
+		Ok(new_txid)
+	}
+
+	// Builds an RBF transaction for cross-wallet transactions.
+	// Reduces change output or adds new inputs as needed to cover the higher fee.
+	fn build_cross_wallet_rbf(
+		&self, wallets: &HashMap>,
+		_persisters: &mut HashMap, original_tx: &Transaction,
+		new_fee_rate: FeeRate, original_fee: Amount,
+	) -> Result<Transaction, Error> {
+		// BIP 125 check: Verify original transaction signals RBF
+		// At least one input must have nSequence < 0xfffffffe
+		let signals_rbf = original_tx.input.iter().any(|input| input.sequence.0 < 0xfffffffe);
+
+		if !signals_rbf {
+			log_error!(
+				self.logger,
+				"Original transaction does not signal RBF (no input has nSequence < 0xfffffffe)"
+			);
+			return Err(Error::OnchainTxCreationFailed);
+		}
+
+		// Calculate the new fee needed (rough estimate)
+		let tx_weight = original_tx.weight();
+		let new_fee_sats = new_fee_rate.to_sat_per_kwu() as u64 * tx_weight.to_wu() / 1000;
+
+		// BIP 125 check: Replacement must pay higher ABSOLUTE fee
+		if new_fee_sats <= original_fee.to_sat() {
+			log_error!(
+				self.logger,
+				"BIP 125 violation: Replacement fee ({} sats) must be higher than original ({} sats)",
+				new_fee_sats,
+				original_fee.to_sat()
+			);
+			return Err(Error::InvalidFeeRate);
+		}
+
+		let additional_fee_needed =
new_fee_sats - original_fee.to_sat(); + + log_info!( + self.logger, + "Cross-wallet RBF: need ~{} additional sats for fee bump", + additional_fee_needed + ); + + // Identify the change output (belongs to a wallet we control) + // and the recipient outputs (external addresses) + let primary_wallet = wallets.get(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + + let mut recipient_outputs: Vec = Vec::new(); + let mut change_outputs: Vec = Vec::new(); + + for output in &original_tx.output { + // OP_RETURN outputs (data carriers) should be preserved as-is + if output.script_pubkey.is_op_return() { + recipient_outputs.push(output.clone()); + continue; + } + + // Check if this output belongs to any of our wallets + let is_ours = wallets.values().any(|w| w.is_mine(output.script_pubkey.clone())); + + if is_ours { + // This is a change output (could be multiple in edge cases) + change_outputs.push(output.clone()); + log_info!( + self.logger, + "Found change output with value {} sats", + output.value.to_sat() + ); + } else { + // This is a recipient output - we must preserve it exactly + recipient_outputs.push(output.clone()); + } + } + + // Calculate total change available + let total_change_value: u64 = change_outputs.iter().map(|o| o.value.to_sat()).sum(); + let has_change = !change_outputs.is_empty(); + + // First approach: Try to just reduce the change output(s) + if has_change && total_change_value > additional_fee_needed + 546 { + let new_change_value = total_change_value - additional_fee_needed; + + log_info!( + self.logger, + "Cross-wallet RBF: reducing change from {} to {} sats", + total_change_value, + new_change_value + ); + + // Rebuild the transaction with reduced change + let mut new_outputs = recipient_outputs.clone(); + + // Get change address from primary wallet + let change_script = + primary_wallet.peek_address(KeychainKind::Internal, 0).address.script_pubkey(); + 
new_outputs.push(TxOut { + value: Amount::from_sat(new_change_value), + script_pubkey: change_script, + }); + + // Create clean unsigned inputs (preserve outpoint and sequence, clear script/witness) + let unsigned_inputs: Vec = original_tx + .input + .iter() + .map(|input| bitcoin::TxIn { + previous_output: input.previous_output, + script_sig: ScriptBuf::new(), + sequence: input.sequence, + witness: bitcoin::Witness::new(), + }) + .collect(); + + // Create unsigned transaction + let unsigned_tx = Transaction { + version: original_tx.version, + lock_time: original_tx.lock_time, + input: unsigned_inputs, + output: new_outputs, + }; + + // Sign with all wallets that own inputs + let signed_tx = self.sign_owned_inputs(unsigned_tx).map_err(|_| { + log_error!(self.logger, "Failed to sign cross-wallet RBF transaction"); + Error::OnchainTxSigningFailed + })?; + + return Ok(signed_tx); + } + + // Second approach: Need to add more inputs + // Collect the original inputs as outpoints + let original_outpoints: Vec = + original_tx.input.iter().map(|i| i.previous_output).collect(); + + log_info!(self.logger, "Cross-wallet RBF: change insufficient, adding more inputs"); + + // Prepare UTXO info for original inputs + let original_utxo_infos = self.prepare_outpoints_for_psbt(&original_outpoints, wallets)?; + + // Get additional available UTXOs from all wallets (excluding ones already used) + let additional_utxos: Vec = wallets + .values() + .flat_map(|w| w.list_unspent()) + .filter(|utxo| !original_outpoints.contains(&utxo.outpoint)) + .collect(); + + // Calculate total from original inputs (try witness_utxo first, then non_witness_utxo) + let original_input_value: u64 = original_utxo_infos + .iter() + .zip(original_outpoints.iter()) + .filter_map(|(u, outpoint)| { + // First try witness_utxo + if let Some(txout) = &u.psbt_input.witness_utxo { + return Some(txout.value.to_sat()); } - }, - Err(err) => { - log_error!(self.logger, "Failed to sign RBF transaction: {}", err); - return 
Err(Error::OnchainTxSigningFailed); - }, + // For legacy inputs, get value from non_witness_utxo + if let Some(tx) = &u.psbt_input.non_witness_utxo { + if let Some(txout) = tx.output.get(outpoint.vout as usize) { + return Some(txout.value.to_sat()); + } + } + log_error!(self.logger, "Could not determine value for input {:?}", outpoint); + None + }) + .sum(); + + // Calculate recipient value + let recipient_value: u64 = recipient_outputs.iter().map(|o| o.value.to_sat()).sum(); + + // Select additional UTXOs if needed, accounting for their weight contribution to fees. + // Each input we add increases tx weight, which increases required fee. + let mut selected_additional: Vec = Vec::new(); + let mut additional_value: u64 = 0; + let mut additional_weight = Weight::ZERO; + + // Calculate how much we need: value shortfall + fee for additional weight + let base_shortfall = (recipient_value + new_fee_sats).saturating_sub(original_input_value); + + if base_shortfall > 0 && additional_utxos.is_empty() { + log_error!( + self.logger, + "Need {} more sats but no additional UTXOs available", + base_shortfall + ); + return Err(Error::InsufficientFunds); + } + + // Greedy selection accounting for input weight + for utxo in additional_utxos { + // Calculate fee contribution of this input + let input_weight = Self::calculate_utxo_weight(&utxo.txout.script_pubkey); + let input_fee_cost = new_fee_rate.to_sat_per_kwu() as u64 * input_weight.to_wu() / 1000; + + // Total needed = base shortfall + fee for all additional inputs so far + let total_additional_fee = + new_fee_rate.to_sat_per_kwu() as u64 * additional_weight.to_wu() / 1000; + let total_needed = base_shortfall + total_additional_fee + input_fee_cost; + + if additional_value >= total_needed { + break; + } + + additional_value += utxo.txout.value.to_sat(); + additional_weight = additional_weight + input_weight; + selected_additional.push(utxo); + } + + // Final check: do we have enough? 
+ let final_additional_fee = + new_fee_rate.to_sat_per_kwu() as u64 * additional_weight.to_wu() / 1000; + let final_needed = base_shortfall + final_additional_fee; + + if additional_value < final_needed { + log_error!( + self.logger, + "Insufficient funds: need {} more (including {} for input fees), only have {}", + final_needed, + final_additional_fee, + additional_value + ); + return Err(Error::InsufficientFunds); } - // Persist wallet changes - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + // Update total fee to include additional input weight + let new_fee_sats = new_fee_sats + final_additional_fee; - // Extract and broadcast the transaction - let tx = psbt.extract_tx().map_err(|e| { - log_error!(self.logger, "Failed to extract transaction: {}", e); - Error::OnchainTxCreationFailed - })?; + // Build new transaction with all inputs + // Create clean unsigned inputs (preserve outpoint and sequence, clear script/witness) + let mut all_inputs: Vec = original_tx + .input + .iter() + .map(|input| bitcoin::TxIn { + previous_output: input.previous_output, + script_sig: ScriptBuf::new(), + sequence: input.sequence, + witness: bitcoin::Witness::new(), + }) + .collect(); - self.broadcaster.broadcast_transactions(&[&tx]); + // Add additional inputs with RBF-signaling sequence + for utxo in &selected_additional { + all_inputs.push(bitcoin::TxIn { + previous_output: utxo.outpoint, + script_sig: ScriptBuf::new(), + sequence: bitcoin::Sequence::ENABLE_RBF_NO_LOCKTIME, + witness: bitcoin::Witness::new(), + }); + } - let new_txid = tx.compute_txid(); + // Calculate new change + let total_input = original_input_value + additional_value; + let new_change = total_input.saturating_sub(recipient_value + new_fee_sats); + + // Build outputs + let mut new_outputs = recipient_outputs; + if new_change >= 546 { + let 
change_script = + primary_wallet.peek_address(KeychainKind::Internal, 0).address.script_pubkey(); + new_outputs + .push(TxOut { value: Amount::from_sat(new_change), script_pubkey: change_script }); + } - // Calculate and log the actual fee increase achieved - let new_fee = locked_wallet.calculate_fee(&tx).unwrap_or(Amount::ZERO); - let actual_fee_rate = new_fee / tx.weight(); + let unsigned_tx = Transaction { + version: original_tx.version, + lock_time: original_tx.lock_time, + input: all_inputs, + output: new_outputs, + }; - log_info!(self.logger, "RBF transaction created successfully!"); - log_info!( - self.logger, - " Original: {} ({} sat/vB, {} sats fee)", - txid, - original_fee_rate.to_sat_per_vb_ceil(), - original_fee.to_sat() - ); - log_info!( - self.logger, - " Replacement: {} ({} sat/vB, {} sats fee)", - new_txid, - actual_fee_rate.to_sat_per_vb_ceil(), - new_fee.to_sat() - ); log_info!( self.logger, - " Additional fee paid: {} sats", - new_fee.to_sat().saturating_sub(original_fee.to_sat()) + "Cross-wallet RBF: built tx with {} inputs, {} outputs, ~{} change", + unsigned_tx.input.len(), + unsigned_tx.output.len(), + new_change ); - Ok(new_txid) + // Sign with all wallets that own inputs + let signed_tx = self.sign_owned_inputs(unsigned_tx).map_err(|_| { + log_error!(self.logger, "Failed to sign cross-wallet RBF transaction"); + Error::OnchainTxSigningFailed + })?; + + Ok(signed_tx) } // Accelerates confirmation of a transaction using Child-Pays-For-Parent (CPFP). // Returns the txid of the child transaction if successful. + // + // For cross-wallet transactions, change typically goes to the primary wallet, + // so this method searches all wallets for spendable outputs from the parent transaction. pub(crate) fn accelerate_by_cpfp( &self, txid: &Txid, fee_rate: FeeRate, destination_address: Option
,
 	) -> Result<Txid, Error> {
-		let mut locked_wallet = self.inner.lock().unwrap();
+		let mut wallets = self.wallets.lock().unwrap();
+		let mut persisters = self.persisters.lock().unwrap();

-		// Find the transaction in the wallet
-		let parent_tx_node = locked_wallet.get_tx(*txid).ok_or_else(|| {
-			log_error!(self.logger, "Transaction not found in wallet: {}", txid);
+		// Find the transaction in any wallet to get its details
+		let tx_wallet_type = Self::find_wallet_for_tx(&wallets, *txid).ok_or_else(|| {
+			log_error!(self.logger, "Transaction not found in any wallet: {}", txid);
 			Error::TransactionNotFound
 		})?;

-		// Check if transaction is confirmed - can't accelerate confirmed transactions
-		if parent_tx_node.chain_position.is_confirmed() {
-			log_error!(self.logger, "Cannot accelerate confirmed transaction: {}", txid);
-			return Err(Error::TransactionAlreadyConfirmed);
+		// Get transaction info first (read-only)
+		let (parent_tx, parent_fee, parent_fee_rate) = {
+			let wallet_ref = wallets.get(&tx_wallet_type).unwrap();
+			let parent_tx_node = wallet_ref.get_tx(*txid).unwrap();
+
+			// Check if transaction is confirmed - can't accelerate confirmed transactions
+			if parent_tx_node.chain_position.is_confirmed() {
+				log_error!(self.logger, "Cannot accelerate confirmed transaction: {}", txid);
+				return Err(Error::TransactionAlreadyConfirmed);
+			}
+
+			let parent_tx = &parent_tx_node.tx_node.tx;
+			let parent_fee = wallet_ref.calculate_fee(parent_tx).map_err(|e| {
+				log_error!(self.logger, "Failed to calculate parent fee: {}", e);
+				Error::WalletOperationFailed
+			})?;
+			let parent_fee_rate = parent_fee / parent_tx.weight();
+
+			(parent_tx.clone(), parent_fee, parent_fee_rate)
+		};
+
+		// Search ALL wallets for spendable outputs from the parent transaction
+		// Change typically goes to the primary wallet, but we check all to be robust
+		let mut utxos: Vec<LocalOutput> = Vec::new();
+		let mut utxo_wallet_type: Option<AddressType> = None;
+
+		// Check primary wallet first (change usually goes here)
+		if let
Some(primary_wallet) = wallets.get(&self.config.address_type) { + let primary_utxos: Vec<_> = + primary_wallet.list_unspent().filter(|utxo| utxo.outpoint.txid == *txid).collect(); + if !primary_utxos.is_empty() { + utxos = primary_utxos; + utxo_wallet_type = Some(self.config.address_type); + } } - // Calculate parent transaction fee and fee rate for validation - let parent_tx = &parent_tx_node.tx_node.tx; - let parent_fee = locked_wallet.calculate_fee(parent_tx).map_err(|e| { - log_error!(self.logger, "Failed to calculate parent fee: {}", e); - Error::WalletOperationFailed - })?; + // If no UTXOs found in primary, check other wallets + if utxos.is_empty() { + for (addr_type, wallet) in wallets.iter() { + let wallet_utxos: Vec<_> = + wallet.list_unspent().filter(|utxo| utxo.outpoint.txid == *txid).collect(); + if !wallet_utxos.is_empty() { + utxos = wallet_utxos; + utxo_wallet_type = Some(*addr_type); + break; + } + } + } - // Use Bitcoin crate's built-in fee rate calculation for accuracy - let parent_fee_rate = parent_fee / parent_tx.weight(); + let address_type = utxo_wallet_type.ok_or_else(|| { + log_error!(self.logger, "No spendable outputs found for transaction: {}", txid); + Error::NoSpendableOutputs + })?; // Log detailed information for debugging log_info!(self.logger, "CPFP Analysis for transaction {}", txid); @@ -452,18 +1338,9 @@ impl Wallet { ); } - // Find spendable outputs from this transaction - let utxos = locked_wallet - .list_unspent() - .filter(|utxo| utxo.outpoint.txid == *txid) - .collect::>(); - - if utxos.is_empty() { - log_error!(self.logger, "No spendable outputs found for transaction: {}", txid); - return Err(Error::NoSpendableOutputs); - } - + // utxos is guaranteed non-empty at this point (handled by address_type check above) log_info!(self.logger, "Found {} spendable output(s) from parent transaction", utxos.len()); + log_info!(self.logger, " UTXOs found in {:?} wallet", address_type); let total_input_value: u64 = utxos.iter().map(|utxo| 
utxo.txout.value.to_sat()).sum(); log_info!(self.logger, " Total input value: {} sats", total_input_value); @@ -471,24 +1348,45 @@ impl Wallet { let script_pubkey = match destination_address { Some(addr) => { log_info!(self.logger, " Destination: {} (user-specified)", addr); - // Validate the address self.parse_and_validate_address(&addr)?; addr.script_pubkey() }, None => { - // Create a new address to send the funds to - let address_info = locked_wallet.next_unused_address(KeychainKind::Internal); + // Create a new address to send the funds to (use primary wallet for change) + // Need to release current lock and get primary wallet + drop(wallets); + drop(persisters); + let mut wallets_primary = self.wallets.lock().unwrap(); + let mut persisters_primary = self.persisters.lock().unwrap(); + let primary_wallet = wallets_primary.get_mut(&self.config.address_type).unwrap(); + let address_info = primary_wallet.next_unused_address(KeychainKind::Internal); + primary_wallet + .persist(persisters_primary.get_mut(&self.config.address_type).unwrap()) + .map_err(|e| { + log_error!(self.logger, "Failed to persist wallet: {}", e); + Error::PersistenceFailed + })?; log_info!( self.logger, " Destination: {} (wallet internal address)", address_info.address ); - address_info.address.script_pubkey() + let script_pubkey = address_info.address.script_pubkey(); + drop(wallets_primary); + drop(persisters_primary); + // Re-acquire locks for the original wallet + wallets = self.wallets.lock().unwrap(); + persisters = self.persisters.lock().unwrap(); + script_pubkey }, }; + // Now get mutable references for building the transaction + let wallet = wallets.get_mut(&address_type).unwrap(); + let persister = persisters.get_mut(&address_type).unwrap(); + // Build a transaction that spends these UTXOs - let mut tx_builder = locked_wallet.build_tx(); + let mut tx_builder = wallet.build_tx(); // Add the UTXOs explicitly for utxo in &utxos { @@ -520,7 +1418,7 @@ impl Wallet { }; // Sign the 
transaction - match locked_wallet.sign(&mut psbt, SignOptions::default()) { + match wallet.sign(&mut psbt, SignOptions::default()) { Ok(finalized) => { if !finalized { log_error!(self.logger, "Failed to finalize CPFP transaction"); @@ -534,8 +1432,7 @@ impl Wallet { } // Persist wallet changes - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { + wallet.persist(persister).map_err(|e| { log_error!(self.logger, "Failed to persist wallet: {}", e); Error::PersistenceFailed })?; @@ -551,7 +1448,7 @@ impl Wallet { let child_txid = tx.compute_txid(); // Calculate and log the actual results - let child_fee = locked_wallet.calculate_fee(&tx).unwrap_or(Amount::ZERO); + let child_fee = wallet.calculate_fee(&tx).unwrap_or(Amount::ZERO); let actual_child_fee_rate = child_fee / tx.weight(); log_info!(self.logger, "CPFP transaction created successfully!"); @@ -583,10 +1480,18 @@ impl Wallet { pub(crate) fn calculate_cpfp_fee_rate( &self, parent_txid: &Txid, urgent: bool, ) -> Result { - let locked_wallet = self.inner.lock().unwrap(); + // Find which wallet contains this transaction + let wallets = self.wallets.lock().unwrap(); + let wallet = wallets + .values() + .find_map(|wallet| wallet.get_tx(*parent_txid).map(|_| wallet)) + .ok_or_else(|| { + log_error!(self.logger, "Transaction not found in any wallet: {}", parent_txid); + Error::TransactionNotFound + })?; // Get the parent transaction - let parent_tx_node = locked_wallet.get_tx(*parent_txid).ok_or_else(|| { + let parent_tx_node = wallet.get_tx(*parent_txid).ok_or_else(|| { log_error!(self.logger, "Transaction not found in wallet: {}", parent_txid); Error::TransactionNotFound })?; @@ -600,7 +1505,7 @@ impl Wallet { let parent_tx = &parent_tx_node.tx_node.tx; // Calculate parent fee and fee rate using accurate method - let parent_fee = locked_wallet.calculate_fee(parent_tx).map_err(|e| { + let parent_fee = wallet.calculate_fee(parent_tx).map_err(|e| { 
log_error!(self.logger, "Failed to calculate parent fee: {}", e); Error::WalletOperationFailed })?; @@ -692,64 +1597,178 @@ impl Wallet { Ok(child_fee_rate) } - fn update_payment_store<'a>( - &self, locked_wallet: &'a mut PersistedWallet, - ) -> Result<(), Error> { - for wtx in locked_wallet.transactions() { - let id = PaymentId(wtx.tx_node.txid.to_byte_array()); - let txid = wtx.tx_node.txid; - let (payment_status, confirmation_status) = match wtx.chain_position { - bdk_chain::ChainPosition::Confirmed { anchor, .. } => { - let confirmation_height = anchor.block_id.height; - let cur_height = locked_wallet.latest_checkpoint().height(); - let payment_status = if cur_height >= confirmation_height + ANTI_REORG_DELAY - 1 - { - PaymentStatus::Succeeded - } else { - PaymentStatus::Pending - }; - let confirmation_status = ConfirmationStatus::Confirmed { - block_hash: anchor.block_id.hash, - height: confirmation_height, - timestamp: anchor.confirmation_time, - }; - (payment_status, confirmation_status) - }, - bdk_chain::ChainPosition::Unconfirmed { .. 
} => { - (PaymentStatus::Pending, ConfirmationStatus::Unconfirmed) - }, + pub(crate) fn update_payment_store_for_txids(&self, txids: Vec<Txid>) -> Result<(), Error> { + // Get read access to all wallets to aggregate amounts + let wallets_read = self.wallets.lock().unwrap(); + + // Get the maximum chain height across all wallets to ensure consistent confirmation status + // This prevents issues where non-primary wallets might be behind in sync + let max_chain_height = + wallets_read.values().map(|w| w.latest_checkpoint().height()).max().unwrap_or(0); + + for txid in txids { + // Find the transaction in any wallet to get transaction data + // Check ALL wallets that have this transaction to get the most up-to-date confirmation status + // This is important because a transaction can exist in multiple wallets (different address types) + // and we want to use the confirmed status if ANY wallet shows it as confirmed + let mut tx_clone_opt = None; + let mut confirmation_status_opt = None; + let mut payment_status_opt = None; + + // First pass: find the transaction and collect confirmation status from all wallets + // Prefer confirmed status over unconfirmed if any wallet shows it as confirmed + let mut found_in_wallet_count = 0; + for (addr_type, wallet) in wallets_read.iter() { + if let Some(tx_node) = wallet.get_tx(txid) { + found_in_wallet_count += 1; + let is_confirmed = tx_node.chain_position.is_confirmed(); + log_debug!( + self.logger, + "Transaction {} found in wallet {:?}: confirmed={}, max_chain_height={}", + txid, + addr_type, + is_confirmed, + max_chain_height + ); + + // Get transaction data (only need to do this once) + if tx_clone_opt.is_none() { + tx_clone_opt = Some((*tx_node.tx_node.tx).clone()); + } + + // Check confirmation status from this wallet + match tx_node.chain_position { + bdk_chain::ChainPosition::Confirmed { anchor, .. 
} => { + // If we already have a confirmed status, keep the one with the lower height (earlier confirmation) + // Otherwise, use this confirmed status + let confirmation_height = anchor.block_id.height; + // Use the max chain height across all wallets for consistent status determination + let cur_height = max_chain_height; + + match &confirmation_status_opt { + Some(ConfirmationStatus::Confirmed { + height: existing_height, + .. + }) => { + // Keep the earlier confirmation (lower height) + if confirmation_height < *existing_height { + payment_status_opt = Some( + if cur_height + >= confirmation_height + ANTI_REORG_DELAY - 1 + { + PaymentStatus::Succeeded + } else { + PaymentStatus::Pending + }, + ); + confirmation_status_opt = + Some(ConfirmationStatus::Confirmed { + block_hash: anchor.block_id.hash, + height: confirmation_height, + timestamp: anchor.confirmation_time, + }); + } + }, + Some(ConfirmationStatus::Unconfirmed) => { + // Upgrade from unconfirmed to confirmed + payment_status_opt = Some( + if cur_height >= confirmation_height + ANTI_REORG_DELAY - 1 + { + PaymentStatus::Succeeded + } else { + PaymentStatus::Pending + }, + ); + confirmation_status_opt = Some(ConfirmationStatus::Confirmed { + block_hash: anchor.block_id.hash, + height: confirmation_height, + timestamp: anchor.confirmation_time, + }); + }, + None => { + // First time seeing this transaction + payment_status_opt = Some( + if cur_height >= confirmation_height + ANTI_REORG_DELAY - 1 + { + PaymentStatus::Succeeded + } else { + PaymentStatus::Pending + }, + ); + confirmation_status_opt = Some(ConfirmationStatus::Confirmed { + block_hash: anchor.block_id.hash, + height: confirmation_height, + timestamp: anchor.confirmation_time, + }); + }, + } + }, + bdk_chain::ChainPosition::Unconfirmed { .. 
} => { + // Only set unconfirmed if we don't already have a confirmed status + if confirmation_status_opt.is_none() { + payment_status_opt = Some(PaymentStatus::Pending); + confirmation_status_opt = Some(ConfirmationStatus::Unconfirmed); + } + }, + } + } + } + + let tx = match tx_clone_opt { + Some(tx) => tx, + None => continue, // Transaction not found in any wallet, skip }; - // TODO: It would be great to introduce additional variants for - // `ChannelFunding` and `ChannelClosing`. For the former, we could just - // take a reference to `ChannelManager` here and check against - // `list_channels`. But for the latter the best approach is much less - // clear: for force-closes/HTLC spends we should be good querying - // `OutputSweeper::tracked_spendable_outputs`, but regular channel closes - // (i.e., `SpendableOutputDescriptor::StaticOutput` variants) are directly - // spent to a wallet address. The only solution I can come up with is to - // create and persist a list of 'static pending outputs' that we could use - // here to determine the `PaymentKind`, but that's not really satisfactory, so - // we're punting on it until we can come up with a better solution. 
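The per-wallet totals collected here feed a direction/amount computation a few lines further down: outbound if `total_sent > total_received`, with the net amount derived via saturating subtraction so aggregated satoshi totals can never underflow. A minimal standalone sketch of just that arithmetic (toy names, not the crate's actual API):

```rust
// Illustrative sketch of the direction/amount_msat math used when a
// transaction's sent/received totals are aggregated across wallets.
// `Direction` and `classify` are hypothetical stand-ins, not crate items.

#[derive(Debug, PartialEq)]
enum Direction {
    Inbound,
    Outbound,
}

// Returns (direction, amount_msat) from aggregated satoshi totals, using
// saturating_sub so fee/change edge cases cannot panic on underflow.
fn classify(total_sent: u64, total_received: u64, total_fee: u64) -> (Direction, u64) {
    if total_sent > total_received {
        let amount_msat =
            total_sent.saturating_sub(total_fee).saturating_sub(total_received) * 1000;
        (Direction::Outbound, amount_msat)
    } else {
        let amount_msat =
            total_received.saturating_sub(total_sent.saturating_sub(total_fee)) * 1000;
        (Direction::Inbound, amount_msat)
    }
}

fn main() {
    // Spend 10_000 sats with a 500 sat fee and 2_000 sats of change back:
    // net outbound amount = 10_000 - 500 - 2_000 = 7_500 sats.
    assert_eq!(classify(10_000, 2_000, 500), (Direction::Outbound, 7_500_000));
    // Pure receive of 5_000 sats.
    assert_eq!(classify(0, 5_000, 0), (Direction::Inbound, 5_000_000));
}
```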
+ + let (payment_status, confirmation_status) = + match (payment_status_opt, confirmation_status_opt) { + (Some(ps), Some(cs)) => (ps, cs), + _ => continue, // Couldn't determine status, skip + }; + + // Aggregate sent and received amounts across ALL wallets that have this transaction + // This is necessary because a transaction can use UTXOs from multiple wallets + let mut total_sent = 0u64; + let mut total_received = 0u64; + + // Get primary wallet for fee calculation + let primary_wallet = wallets_read.get(&self.config.address_type).unwrap(); + let total_fee = primary_wallet.calculate_fee(&tx).unwrap_or(Amount::ZERO).to_sat(); + + for wallet in wallets_read.values() { + if wallet.get_tx(txid).is_some() { + let (sent, received) = wallet.sent_and_received(&tx); + total_sent += sent.to_sat(); + total_received += received.to_sat(); + } + } + + let id = PaymentId(txid.to_byte_array()); let kind = crate::payment::PaymentKind::Onchain { txid, status: confirmation_status }; - let fee = locked_wallet.calculate_fee(&wtx.tx_node.tx).unwrap_or(Amount::ZERO); - let (sent, received) = locked_wallet.sent_and_received(&wtx.tx_node.tx); - let (direction, amount_msat) = if sent > received { + + let (direction, amount_msat) = if total_sent > total_received { let direction = PaymentDirection::Outbound; let amount_msat = Some( - sent.to_sat().saturating_sub(fee.to_sat()).saturating_sub(received.to_sat()) - * 1000, + total_sent.saturating_sub(total_fee).saturating_sub(total_received) * 1000, ); (direction, amount_msat) } else { let direction = PaymentDirection::Inbound; let amount_msat = Some( - received.to_sat().saturating_sub(sent.to_sat().saturating_sub(fee.to_sat())) - * 1000, + total_received.saturating_sub(total_sent.saturating_sub(total_fee)) * 1000, ); (direction, amount_msat) }; - let fee_paid_msat = Some(fee.to_sat() * 1000); + let fee_paid_msat = Some(total_fee * 1000); + + log_debug!( + self.logger, + "Updating payment store for txid {}: status={:?}, confirmation={:?}, 
found_in_wallets={}", + txid, + payment_status, + confirmation_status, + found_in_wallet_count + ); let payment = PaymentDetails::new( id, @@ -763,9 +1782,29 @@ impl Wallet { self.payment_store.insert_or_update(payment)?; } + drop(wallets_read); + Ok(()) } + pub(crate) fn update_payment_store_for_all_transactions(&self) -> Result<(), Error> { + let wallets_read = self.wallets.lock().unwrap(); + let mut all_txids = std::collections::HashSet::new(); + for wallet in wallets_read.values() { + for wtx in wallet.transactions() { + all_txids.insert(wtx.tx_node.txid); + } + } + drop(wallets_read); + + if !all_txids.is_empty() { + let txids_vec: Vec = all_txids.into_iter().collect(); + self.update_payment_store_for_txids(txids_vec) + } else { + Ok(()) + } + } + #[allow(deprecated)] pub(crate) fn create_funding_transaction( &self, output_script: ScriptBuf, amount: Amount, confirmation_target: ConfirmationTarget, @@ -773,11 +1812,58 @@ impl Wallet { ) -> Result { let fee_rate = self.fee_estimator.estimate_fee_rate(confirmation_target); - let mut locked_wallet = self.inner.lock().unwrap(); - let mut tx_builder = locked_wallet.build_tx(); + // Support multi-wallet funding: collect UTXOs from all wallets and manually add them + // Note: Lightning requires the funding OUTPUT to be a witness script (P2WSH), + // but the INPUTS can be any type of UTXO (Legacy, NestedSegwit, NativeSegwit, Taproot). + // The change output will use a witness address via ChangeDestinationSource. + + // Get all spendable UTXOs from all wallets (excluding funding transactions) + // We need a channel_manager to filter funding transactions, but we don't have one here. + // For funding transactions, we can skip the funding transaction filter since we're creating one. 
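The funding path described in the comments above performs two steps: flatten every per-address-type wallet's UTXOs into one candidate set, then run largest-first coin selection over it (the `CoinSelectionAlgorithm::LargestFirst` default). A hedged standalone sketch of those steps, with toy types standing in for the BDK wallet and UTXO structures:

```rust
// Toy model of cross-wallet UTXO aggregation plus largest-first selection.
// AddressType mirrors the fork's enum; the (type, sats) pairs are stand-ins
// for real LocalOutput entries.

#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum AddressType {
    Legacy,
    NativeSegwit,
    Taproot,
}

// Flatten (wallet -> UTXO values in sats) into a single candidate list.
fn aggregate_utxos(wallets: &[(AddressType, Vec<u64>)]) -> Vec<(AddressType, u64)> {
    wallets.iter().flat_map(|(t, utxos)| utxos.iter().map(move |v| (*t, *v))).collect()
}

// Largest-first: keep taking the biggest UTXO until the target is covered.
fn select_largest_first(
    mut candidates: Vec<(AddressType, u64)>, target_sats: u64,
) -> Option<Vec<(AddressType, u64)>> {
    candidates.sort_by(|a, b| b.1.cmp(&a.1));
    let mut selected = Vec::new();
    let mut total = 0u64;
    for c in candidates {
        if total >= target_sats {
            break;
        }
        total += c.1;
        selected.push(c);
    }
    (total >= target_sats).then_some(selected)
}

fn main() {
    let wallets = vec![
        (AddressType::Legacy, vec![30_000]),
        (AddressType::NativeSegwit, vec![50_000, 10_000]),
        (AddressType::Taproot, vec![20_000]),
    ];
    let candidates = aggregate_utxos(&wallets);
    // Covering 70_000 sats picks the 50_000 and 30_000 UTXOs, spanning two
    // different address types in one transaction.
    let picked = select_largest_first(candidates, 70_000).unwrap();
    assert_eq!(picked, vec![(AddressType::NativeSegwit, 50_000), (AddressType::Legacy, 30_000)]);
}
```

Note this sketch ignores fees; the real selection also accounts for per-input satisfaction weight at the given fee rate.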
+ let wallets_read = self.wallets.lock().unwrap(); + let all_available_utxos: Vec<_> = + wallets_read.values().flat_map(|w| w.list_unspent()).collect(); + drop(wallets_read); + + // Get drain script (change address) from primary wallet + let drain_script = { + let wallets_read = self.wallets.lock().unwrap(); + let primary_wallet = wallets_read.get(&self.config.address_type).unwrap(); + primary_wallet.peek_address(KeychainKind::Internal, 0).address.script_pubkey() + }; + + // Select UTXOs from all wallets using coin selection algorithm + // We pass None for channel_manager since we're creating a funding transaction + // and don't need to filter existing funding transactions + let selected_outpoints = self.select_utxos_with_algorithm( + amount.to_sat(), + all_available_utxos, + fee_rate, + CoinSelectionAlgorithm::LargestFirst, // Use LargestFirst as default + &drain_script, + None, // No channel_manager - skip funding transaction filtering + )?; + + // Prepare UTXO info for selected UTXOs from all wallets + let wallets_read = self.wallets.lock().unwrap(); + let utxo_infos = self.prepare_outpoints_for_psbt(&selected_outpoints, &wallets_read)?; + drop(wallets_read); + // Build transaction with selected UTXOs from all wallets + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + + let mut tx_builder = wallet.build_tx(); tx_builder.add_recipient(output_script, amount).fee_rate(fee_rate).nlocktime(locktime); + // Add selected UTXOs using helper + self.add_utxos_to_tx_builder(&mut tx_builder, &utxo_infos)?; + tx_builder.manually_selected_only(); + let mut psbt = match tx_builder.finish() { Ok(psbt) => { log_trace!(self.logger, "Created funding PSBT: {:?}", psbt); @@ -789,23 +1875,54 @@ impl Wallet { }, }; - match locked_wallet.sign(&mut psbt, 
SignOptions::default()) { - Ok(finalized) => { - if !finalized { - return Err(Error::OnchainTxCreationFailed); + // Sign with each wallet that owns inputs in the transaction + // This is necessary because inputs might come from different wallets + let mut wallet_inputs: HashMap<AddressType, Vec<usize>> = HashMap::new(); + for (i, txin) in psbt.unsigned_tx.input.iter().enumerate() { + // Find which wallet owns this UTXO + for (addr_type, w) in wallets.iter() { + if w.get_utxo(txin.previous_output).is_some() { + wallet_inputs.entry(*addr_type).or_insert_with(Vec::new).push(i); + break; + } - }, - Err(err) => { - log_error!(self.logger, "Failed to create funding transaction: {}", err); - return Err(err.into()); - }, + } } - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + // Sign with each wallet that has inputs + for (addr_type, _input_indices) in wallet_inputs { + let wallet = wallets.get_mut(&addr_type).ok_or_else(|| { + log_error!(self.logger, "Wallet not found for address type {:?}", addr_type); + Error::WalletOperationFailed + })?; + let persister = persisters.get_mut(&addr_type).ok_or_else(|| { + log_error!(self.logger, "Persister not found for address type {:?}", addr_type); + Error::WalletOperationFailed + })?; + + let mut sign_options = SignOptions::default(); + sign_options.trust_witness_utxo = true; + + match wallet.sign(&mut psbt, sign_options) { + Ok(_finalized) => { + // Note: finalized might be false if there are other unsigned inputs, which is expected + }, + Err(err) => { + log_error!( + self.logger, + "Failed to sign funding transaction with wallet {:?}: {}", + addr_type, + err + ); + return Err(Error::OnchainTxCreationFailed); + }, + } + + // Persist the wallet after signing + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet {:?}: {}", addr_type, e); + 
Error::PersistenceFailed + })?; + } let tx = psbt.extract_tx().map_err(|e| { log_error!(self.logger, "Failed to extract transaction: {}", e); @@ -816,38 +1933,108 @@ impl Wallet { } pub(crate) fn get_new_address(&self) -> Result<Address, Error> { - let mut locked_wallet = self.inner.lock().unwrap(); - let mut locked_persister = self.persister.lock().unwrap(); + self.get_new_address_for_type(self.config.address_type) + } - let address_info = locked_wallet.reveal_next_address(KeychainKind::External); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; - Ok(address_info.address) + pub(crate) fn get_new_address_for_type( + &self, address_type: AddressType, + ) -> Result<Address, Error> { + self.with_wallet_mut(address_type, |wallet, persister| { + let address_info = wallet.reveal_next_address(KeychainKind::External); + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet: {}", e); + Error::PersistenceFailed + })?; + Ok(address_info.address) + }) } pub(crate) fn get_new_internal_address(&self) -> Result<Address, Error> { - let mut locked_wallet = self.inner.lock().unwrap(); - let mut locked_persister = self.persister.lock().unwrap(); + self.with_primary_wallet_mut(|wallet, persister| { + let address_info = wallet.next_unused_address(KeychainKind::Internal); + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet: {}", e); + Error::PersistenceFailed + })?; + Ok(address_info.address) + }) + } - let address_info = locked_wallet.next_unused_address(KeychainKind::Internal); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; - Ok(address_info.address) + // Internal helper for getting witness addresses for Lightning channel operations. 
+ fn get_witness_address_impl(&self, keychain: KeychainKind) -> Result<Address, Error> { + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + + // Helper closure to get address from a wallet + let get_address = |wallet: &mut PersistedWallet<KVStoreWalletPersister>, + persister: &mut KVStoreWalletPersister, + keychain: KeychainKind| + -> Result<Address, Error> { + let address_info = match keychain { + KeychainKind::External => wallet.reveal_next_address(keychain), + KeychainKind::Internal => wallet.next_unused_address(keychain), + }; + wallet.persist(persister).map_err(|_| Error::PersistenceFailed)?; + Ok(address_info.address) + }; + + // If primary wallet is already a witness type, use it + if matches!(self.config.address_type, AddressType::NativeSegwit | AddressType::Taproot) { + if let (Some(wallet), Some(persister)) = ( + wallets.get_mut(&self.config.address_type), + persisters.get_mut(&self.config.address_type), + ) { + return get_address(wallet, persister, keychain); + } + } + + // Try NativeSegwit first + if let (Some(wallet), Some(persister)) = ( + wallets.get_mut(&AddressType::NativeSegwit), + persisters.get_mut(&AddressType::NativeSegwit), + ) { + return get_address(wallet, persister, keychain); + } + + // Fall back to Taproot + if let (Some(wallet), Some(persister)) = + (wallets.get_mut(&AddressType::Taproot), persisters.get_mut(&AddressType::Taproot)) + { + return get_address(wallet, persister, keychain); + } + + // If no witness wallet is available, this is a configuration error + log_error!( + self.logger, + "No witness wallet (NativeSegwit or Taproot) available for Lightning operations" + ); + Err(Error::WalletOperationFailed) } - pub(crate) fn cancel_tx(&self, tx: &Transaction) -> Result<(), Error> { - let mut locked_wallet = self.inner.lock().unwrap(); - let mut locked_persister = self.persister.lock().unwrap(); + // Get a new witness address for Lightning channel operations. 
+ pub(crate) fn get_new_witness_address(&self) -> Result<Address, Error> { + self.get_witness_address_impl(KeychainKind::External) + } - locked_wallet.cancel_tx(tx); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + // Get a new witness internal address for Lightning channel operations. + pub(crate) fn get_new_witness_internal_address(&self) -> Result<Address, Error> { + self.get_witness_address_impl(KeychainKind::Internal) + } + + pub(crate) fn cancel_tx(&self, tx: &Transaction) -> Result<(), Error> { + // Cancel transaction in all wallets (in case it exists in multiple) + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + + for (address_type, wallet) in wallets.iter_mut() { + wallet.cancel_tx(tx); + if let Some(persister) = persisters.get_mut(address_type) { + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet {:?}: {}", address_type, e); + Error::PersistenceFailed + })?; + } + } Ok(()) } @@ -855,28 +2042,26 @@ impl Wallet { pub(crate) fn get_balances( &self, total_anchor_channels_reserve_sats: u64, ) -> Result<(u64, u64), Error> { - let balance = self.inner.lock().unwrap().balance(); - - // Make sure `list_confirmed_utxos` returns at least one `Utxo` we could use to spend/bump - // Anchors if we have any confirmed amounts. 
- #[cfg(debug_assertions)] - if balance.confirmed != Amount::ZERO { - debug_assert!( - self.list_confirmed_utxos_inner().map_or(false, |v| !v.is_empty()), - "Confirmed amounts should always be available for Anchor spending" - ); - } - + let balance = self.get_aggregate_balance(); self.get_balances_inner(balance, total_anchor_channels_reserve_sats) } fn get_balances_inner( &self, balance: Balance, total_anchor_channels_reserve_sats: u64, ) -> Result<(u64, u64), Error> { + // Calculate trusted_spendable manually to avoid Amount subtraction underflow panic + // trusted_spendable = confirmed + trusted_pending - untrusted_pending + // We use saturating operations on satoshis to avoid panics + let confirmed_sats = balance.confirmed.to_sat(); + let trusted_pending_sats = balance.trusted_pending.to_sat(); + let untrusted_pending_sats = balance.untrusted_pending.to_sat(); + let trusted_spendable_sats = + (confirmed_sats + trusted_pending_sats).saturating_sub(untrusted_pending_sats); + let spendable_base = if self.config.include_untrusted_pending_in_spendable { - balance.trusted_spendable().to_sat() + balance.untrusted_pending.to_sat() + trusted_spendable_sats + untrusted_pending_sats } else { - balance.trusted_spendable().to_sat() + trusted_spendable_sats }; let (total, spendable) = ( @@ -893,14 +2078,71 @@ impl Wallet { self.get_balances(total_anchor_channels_reserve_sats).map(|(_, s)| s) } + // Get the balance for a specific address type. + // Returns (total_sats, spendable_sats) for the specified wallet. 
+ pub(crate) fn get_balance_for_address_type( + &self, address_type: AddressType, + ) -> Result<(u64, u64), Error> { + let wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get(&address_type).ok_or_else(|| { + log_error!(self.logger, "Wallet not found for address type {:?}", address_type); + Error::WalletOperationFailed + })?; + + let balance = wallet.balance(); + drop(wallets); + + let confirmed_sats = balance.confirmed.to_sat(); + let trusted_pending_sats = balance.trusted_pending.to_sat(); + let untrusted_pending_sats = balance.untrusted_pending.to_sat(); + let trusted_spendable_sats = + (confirmed_sats + trusted_pending_sats).saturating_sub(untrusted_pending_sats); + + let spendable = if self.config.include_untrusted_pending_in_spendable { + trusted_spendable_sats + untrusted_pending_sats + } else { + trusted_spendable_sats + }; + + Ok((balance.total().to_sat(), spendable)) + } + + // Get all loaded address types (primary + monitored). + pub(crate) fn get_loaded_address_types(&self) -> Vec<AddressType> { + let wallets = self.wallets.lock().unwrap(); + wallets.keys().copied().collect() + } + + // Get transaction details including inputs, outputs, and net amount. // Returns None if the transaction is not found in the wallet. 
pub(crate) fn get_tx_details(&self, txid: &Txid) -> Option<(i64, Vec, Vec)> { - let locked_wallet = self.inner.lock().unwrap(); - let tx_node = locked_wallet.get_tx(*txid)?; - let tx = &tx_node.tx_node.tx; - let (sent, received) = locked_wallet.sent_and_received(tx); - let net_amount = received.to_sat() as i64 - sent.to_sat() as i64; + // Check all wallets for the transaction + let wallets = self.wallets.lock().unwrap(); + + // First, find the transaction in any wallet to get the transaction data + let mut tx_clone_opt = None; + for wallet in wallets.values() { + if let Some(tx_node) = wallet.get_tx(*txid) { + tx_clone_opt = Some((*tx_node.tx_node.tx).clone()); + break; + } + } + + let tx = tx_clone_opt?; + + // Aggregate sent and received amounts across ALL wallets that have this transaction + // This is necessary because a transaction can use UTXOs from multiple wallets + let mut total_sent = 0u64; + let mut total_received = 0u64; + for wallet in wallets.values() { + if wallet.get_tx(*txid).is_some() { + let (sent, received) = wallet.sent_and_received(&tx); + total_sent += sent.to_sat(); + total_received += received.to_sat(); + } + } + + let net_amount = total_received as i64 - total_sent as i64; let inputs: Vec = tx.input.iter().map(|tx_input| TxInput::from_tx_input(tx_input)).collect(); @@ -928,10 +2170,13 @@ impl Wallet { pub fn get_spendable_utxos( &self, channel_manager: &ChannelManager, ) -> Result, Error> { - let locked_wallet = self.inner.lock().unwrap(); - - // Get all unspent outputs from the wallet - let all_utxos: Vec = locked_wallet.list_unspent().collect(); + // Collect UTXOs from all wallets + let wallets = self.wallets.lock().unwrap(); + let mut all_utxos = Vec::new(); + for wallet in wallets.values() { + let wallet_utxos: Vec = wallet.list_unspent().collect(); + all_utxos.extend(wallet_utxos); + } let total_count = all_utxos.len(); // Filter out channel funding transactions @@ -966,24 +2211,30 @@ impl Wallet { // Returns selected UTXOs that meet 
the target amount plus fees, excluding channel funding txs. pub fn select_utxos_with_algorithm( &self, target_amount: u64, available_utxos: Vec, fee_rate: FeeRate, - algorithm: CoinSelectionAlgorithm, drain_script: &Script, channel_manager: &ChannelManager, + algorithm: CoinSelectionAlgorithm, drain_script: &Script, + channel_manager: Option<&ChannelManager>, ) -> Result, Error> { - // First, filter out any funding transactions for safety - let safe_utxos: Vec = available_utxos - .into_iter() - .filter(|utxo| { - if self.is_funding_transaction(&utxo.outpoint.txid, channel_manager) { - log_debug!( - self.logger, - "Filtering out UTXO {:?} as it's part of a channel funding transaction", - utxo.outpoint - ); - false - } else { - true - } - }) - .collect(); + // First, filter out any funding transactions for safety (if channel_manager is provided) + let safe_utxos: Vec = if let Some(channel_manager) = channel_manager { + available_utxos + .into_iter() + .filter(|utxo| { + if self.is_funding_transaction(&utxo.outpoint.txid, channel_manager) { + log_debug!( + self.logger, + "Filtering out UTXO {:?} as it's part of a channel funding transaction", + utxo.outpoint + ); + false + } else { + true + } + }) + .collect() + } else { + // No channel_manager provided, skip filtering (e.g., for funding transactions) + available_utxos + }; if safe_utxos.is_empty() { log_error!( @@ -994,12 +2245,14 @@ impl Wallet { } // Use the improved weight calculation from the second implementation - let locked_wallet = self.inner.lock().unwrap(); + // Use primary wallet for weight calculation (all wallets should have similar weight) + let wallets = self.wallets.lock().unwrap(); + let primary_wallet = wallets.get(&self.config.address_type).unwrap(); let weighted_utxos: Vec = safe_utxos .iter() .map(|utxo| { // Use BDK's descriptor to calculate satisfaction weight - let descriptor = locked_wallet.public_descriptor(utxo.keychain); + let descriptor = primary_wallet.public_descriptor(utxo.keychain); 
let satisfaction_weight = descriptor.max_weight_to_satisfy().unwrap_or_else(|_| { // Fallback to manual calculation if BDK method fails log_debug!( @@ -1022,11 +2275,33 @@ impl Wallet { // Total weight = 41 * 4 + 66 = 230 WU Weight::from_wu(230) }, + None => { + // Non-witness script (P2PKH or P2SH) + // Check if it's P2SH-wrapped SegWit (nested segwit) + // P2SH scripts start with OP_HASH160 (0xa9) followed by 20-byte hash and OP_EQUAL (0x87) + let script_bytes = utxo.txout.script_pubkey.as_bytes(); + if script_bytes.len() == 23 + && script_bytes[0] == 0xa9 + && script_bytes[21] == 0x87 + { + // P2SH-wrapped P2WPKH (nested segwit): + // Non-witness data: 32 + 4 + 23 (redeem script) + 4 = 63 bytes + // Witness data: 1 + 1 + 72 + 1 + 33 = 108 bytes + // Total weight = 63 * 4 + 108 = 360 WU + Weight::from_wu(360) + } else { + // P2PKH (legacy): + // Non-witness data: 32 + 4 + 107 (script_sig: 1 + 72 + 1 + 33) + 4 = 147 bytes + // No witness data + // Total weight = 147 * 4 = 588 WU + Weight::from_wu(588) + } + }, _ => { - // Conservative fallback for unknown script types + // Unknown witness version (future witness versions) log_warn!( self.logger, - "Unknown script type for UTXO {:?}, using conservative weight", + "Unknown witness version for UTXO {:?}, using conservative weight", utxo.outpoint ); Weight::from_wu(272) @@ -1038,8 +2313,6 @@ impl Wallet { }) .collect(); - drop(locked_wallet); - let target = Amount::from_sat(target_amount); let mut rng = OsRng; @@ -1085,13 +2358,12 @@ impl Wallet { }) .collect(); - log_info!( + log_debug!( self.logger, - "Selected {} UTXOs using {:?} algorithm for target {} sats (fee: {} sats)", + "Selected {} UTXOs using {:?} algorithm for target {} sats", selected_outputs.len(), algorithm, target_amount, - result.fee_amount.to_sat(), ); Ok(selected_outputs.into_iter().map(|u| u.outpoint).collect()) } @@ -1101,13 +2373,46 @@ impl Wallet { fn build_transaction_psbt( &self, address: &Address, send_amount: OnchainSendAmount, fee_rate: 
FeeRate, utxos_to_spend: Option>, channel_manager: &ChannelManager, - ) -> Result<(Psbt, MutexGuard<'_, PersistedWallet>), Error> { - let mut locked_wallet = self.inner.lock().unwrap(); + ) -> Result { + // Validate and check UTXOs if provided - do this BEFORE acquiring mutable lock + let wallet_utxos: Vec<_> = if utxos_to_spend.is_some() { + let wallets_read = self.wallets.lock().unwrap(); + wallets_read.values().flat_map(|w| w.list_unspent()).collect() + } else { + Vec::new() + }; + + // Prepare the tx_builder. We properly check the reserve requirements (again) further down. + + // For AllRetainingReserve, we need balance and UTXOs from all wallets - collect BEFORE getting mutable wallet + let (_balance_for_all, change_address_info_for_all, all_utxos_for_tmp) = if matches!(send_amount, OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats: reserve } if reserve > DUST_LIMIT_SATS) + { + let wallets_read = self.wallets.lock().unwrap(); + let total_balance = Self::get_aggregate_balance_from_wallets(&wallets_read); + let change_address_info = wallets_read + .get(&self.config.address_type) + .unwrap() + .peek_address(KeychainKind::Internal, 0); + let all_unspent_utxos: Vec<_> = + wallets_read.values().flat_map(|w| w.list_unspent()).collect(); + (Some(total_balance), Some(change_address_info), Some(all_unspent_utxos)) + } else { + (None, None, None) + }; + + // For AllDrainingReserve, collect UTXOs from all wallets + let all_utxos_for_drain: Option> = if matches!( + send_amount, + OnchainSendAmount::AllDrainingReserve | OnchainSendAmount::AllRetainingReserve { .. 
} + ) { + let wallets_read = self.wallets.lock().unwrap(); + Some(wallets_read.values().flat_map(|w| w.list_unspent()).collect()) + } else { + None + }; - // Validate and check UTXOs if provided + // Validate and check UTXOs if provided - do this BEFORE acquiring mutable lock if let Some(ref outpoints) = utxos_to_spend { - // Get all wallet UTXOs for validation - let wallet_utxos: Vec<_> = locked_wallet.list_unspent().collect(); let wallet_utxo_set: std::collections::HashSet<_> = wallet_utxos.iter().map(|u| u.outpoint).collect(); @@ -1132,30 +2437,14 @@ impl Wallet { // Calculate total value of selected UTXOs let selected_value: u64 = wallet_utxos .iter() - .filter(|u| outpoints.contains(&u.outpoint)) - .map(|u| u.txout.value.to_sat()) - .sum(); - - // For exact amounts, ensure we have enough value - if let OnchainSendAmount::ExactRetainingReserve { amount_sats, .. } = send_amount { - // Calculate a fee buffer based on fee rate - // Assume a typical tx with 1 input and 2 outputs (~200 vbytes) - let typical_tx_weight = Weight::from_vb(200).expect("Valid weight"); - let fee_buffer = - fee_rate.fee_wu(typical_tx_weight).expect("Valid fee calculation").to_sat(); - // Use at least 1000 sats as minimum buffer - let min_fee_buffer = fee_buffer.max(1000); - let min_required = amount_sats.saturating_add(min_fee_buffer); - if selected_value < min_required { - log_error!( - self.logger, - "Selected UTXOs have insufficient value. Have: {}sats, Need at least: {}sats", - selected_value, - min_required - ); - return Err(Error::InsufficientFunds); - } - } + .filter(|u| outpoints.contains(&u.outpoint)) + .map(|u| u.txout.value.to_sat()) + .sum(); + + // Note: We don't do a pre-check here for insufficient funds when utxos_to_spend is provided. 
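Editorial aside: the manual input-weight fallback earlier in this hunk (230/360/588 WU, with 272 WU as the unknown-script fallback) can be isolated as a small classifier over the raw `script_pubkey` bytes. This is a hedged sketch, not the fork's actual helper; the function name is hypothetical and the weights simply mirror the per-branch comments in the diff.

```rust
// Hypothetical sketch of the manual input-weight fallback in this diff.
// Script types are detected from raw script_pubkey bytes; returned values are
// weight units (WU) matching the diff's comments: P2WPKH 230, nested SegWit
// (P2SH-P2WPKH) 360, legacy P2PKH 588, conservative fallback 272.

/// Estimate the weight (in WU) an input of the given script type adds to a tx.
fn estimate_input_weight(script_pubkey: &[u8]) -> u64 {
    match script_pubkey {
        // P2WPKH: OP_0 OP_PUSHBYTES_20 <20-byte hash> (22 bytes)
        // Non-witness: 32 + 4 + 1 + 4 = 41 bytes; witness ~66 => 41*4 + 66
        [0x00, 0x14, ..] if script_pubkey.len() == 22 => 230,
        // P2SH: OP_HASH160 OP_PUSHBYTES_20 <20-byte hash> OP_EQUAL (23 bytes),
        // assumed here to wrap P2WPKH.
        // Non-witness: 32 + 4 + 23 + 4 = 63 bytes; witness ~108 => 63*4 + 108
        [0xa9, 0x14, .., 0x87] if script_pubkey.len() == 23 => 360,
        // P2PKH: OP_DUP OP_HASH160 <20-byte hash> OP_EQUALVERIFY OP_CHECKSIG
        // Non-witness: 32 + 4 + 107 + 4 = 147 bytes, no witness => 147*4
        [0x76, 0xa9, .., 0x88, 0xac] if script_pubkey.len() == 25 => 588,
        // Unknown script type: conservative fallback
        _ => 272,
    }
}

fn main() {
    let mut p2wpkh = vec![0x00u8, 0x14];
    p2wpkh.extend([0u8; 20]);
    println!("p2wpkh input weight: {} WU", estimate_input_weight(&p2wpkh));
}
```

Note the real code path only reaches this fallback when BDK's `max_weight_to_satisfy()` fails; when available, the descriptor-derived weight is strictly more accurate.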
+ // The actual transaction building and reserve check (later in this function) will correctly + // validate if there are sufficient funds, including proper fee calculation that accounts + // for the actual transaction weight and any foreign UTXOs. log_debug!( self.logger, @@ -1165,146 +2454,354 @@ impl Wallet { ); } - // Prepare the tx_builder. We properly check the reserve requirements (again) further down. - const DUST_LIMIT_SATS: u64 = 546; - let mut tx_builder = match send_amount { - OnchainSendAmount::ExactRetainingReserve { amount_sats, .. } => { - let mut tx_builder = locked_wallet.build_tx(); - let amount = Amount::from_sat(amount_sats); - tx_builder.add_recipient(address.script_pubkey(), amount).fee_rate(fee_rate); - tx_builder - }, - OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats } - if cur_anchor_reserve_sats > DUST_LIMIT_SATS => - { - let change_address_info = locked_wallet.peek_address(KeychainKind::Internal, 0); - let balance = locked_wallet.balance(); - let spendable_amount_sats = self - .get_balances_inner(balance, cur_anchor_reserve_sats) - .map(|(_, s)| s) - .unwrap_or(0); - let tmp_tx = { - let mut tmp_tx_builder = locked_wallet.build_tx(); - tmp_tx_builder - .drain_wallet() - .drain_to(address.script_pubkey()) - .add_recipient( - change_address_info.address.script_pubkey(), - Amount::from_sat(cur_anchor_reserve_sats), - ) - .fee_rate(fee_rate); - - // Add manual UTXOs to temporary transaction if specified - if let Some(ref outpoints) = utxos_to_spend { - for outpoint in outpoints { - tmp_tx_builder.add_utxo(*outpoint).map_err(|e| { + // Use primary wallet for building transactions + // We need to handle utxos_to_spend before building, so let's collect that info first + let utxo_info_for_manual = if let Some(ref outpoints) = utxos_to_spend { + log_info!( + self.logger, + "build_transaction_psbt: Processing {} manually selected UTXOs", + outpoints.len() + ); + let wallets_read = self.wallets.lock().unwrap(); + let utxo_infos = 
self.prepare_outpoints_for_psbt(outpoints, &wallets_read)?; + Some(utxo_infos) + } else { + None + }; + + // Now build the transaction + // We need to handle different cases - some need to collect data before acquiring mutable lock + // For cases where utxos_to_spend is provided, we finish the transaction inside the match arm + // to avoid lifetime issues with the wallets lock + let psbt = if utxos_to_spend.is_some() { + // Handle case where utxos_to_spend is provided - finish transaction inside match arm + let mut wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + let mut tx_builder = wallet.build_tx(); + + // Configure transaction based on send_amount type + match send_amount { + OnchainSendAmount::ExactRetainingReserve { amount_sats, .. } => { + let amount = Amount::from_sat(amount_sats); + tx_builder.add_recipient(address.script_pubkey(), amount).fee_rate(fee_rate); + }, + OnchainSendAmount::AllDrainingReserve => { + // For drain operations with utxos_to_spend, drain to the address + tx_builder.drain_to(address.script_pubkey()).fee_rate(fee_rate); + }, + OnchainSendAmount::AllRetainingReserve { .. } => { + // This case shouldn't typically use utxos_to_spend, but handle it if needed + // We'll drain to the address (reserve will be handled by the reserve check later) + tx_builder.drain_to(address.script_pubkey()).fee_rate(fee_rate); + }, + } + + // Add specified UTXOs using helper + if let Some(ref utxo_infos) = utxo_info_for_manual { + self.add_utxos_to_tx_builder(&mut tx_builder, utxo_infos)?; + tx_builder.manually_selected_only(); + } + + tx_builder.finish().map_err(|e| { + log_error!(self.logger, "Failed to create transaction: {}", e); + e + })? 
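The `ExactRetainingReserve` branch below selects inputs across all wallets via `select_utxos_with_algorithm(..., CoinSelectionAlgorithm::LargestFirst, ...)`. A minimal largest-first selector can be sketched as follows; the flat per-input fee model and all names here are simplifying assumptions, whereas the real implementation derives fees from per-input satisfaction weights and the drain script.

```rust
// Minimal largest-first coin selection: a simplified stand-in for the
// CoinSelectionAlgorithm::LargestFirst path in this diff. Fee model is an
// assumed flat cost per selected input.

/// A candidate UTXO: (identifier, value in sats).
type Utxo = (u32, u64);

/// Pick UTXOs largest-first until they cover `target_sats` plus an estimated
/// fee of `fee_per_input_sats` per selected input. Returns the chosen ids,
/// or None if the available funds are insufficient.
fn select_largest_first(
    mut utxos: Vec<Utxo>, target_sats: u64, fee_per_input_sats: u64,
) -> Option<Vec<u32>> {
    utxos.sort_by(|a, b| b.1.cmp(&a.1)); // largest value first
    let mut selected = Vec::new();
    let mut total = 0u64;
    for (id, value) in utxos {
        selected.push(id);
        total += value;
        // Each added input grows the transaction, so the fee target grows too.
        let required = target_sats + fee_per_input_sats * selected.len() as u64;
        if total >= required {
            return Some(selected);
        }
    }
    None
}

fn main() {
    let utxos = vec![(1, 10_000), (2, 50_000), (3, 2_000)];
    // 40_000-sat target plus 500 sats/input: the 50_000-sat UTXO alone suffices.
    println!("{:?}", select_largest_first(utxos, 40_000, 500));
}
```

Largest-first minimizes input count (and thus fee weight) at the cost of consolidating privacy-relevant history into fewer, larger spends.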
+ } else { + // Handle cases where utxos_to_spend is not provided - finish transaction in each match arm + match send_amount { + OnchainSendAmount::ExactRetainingReserve { amount_sats, .. } => { + // For ExactRetainingReserve without utxos_to_spend, manually select UTXOs from all wallets + // using our coin selection algorithm, then add them explicitly to the transaction builder + + // Get all spendable UTXOs from all wallets + let all_available_utxos = self.get_spendable_utxos(channel_manager)?; + + // Get drain script (change address) from primary wallet + // Need to acquire lock for this, but we'll drop it before building tx_builder + let drain_script = { + let wallets_read = self.wallets.lock().unwrap(); + let primary_wallet = wallets_read.get(&self.config.address_type).unwrap(); + primary_wallet + .peek_address(KeychainKind::Internal, 0) + .address + .script_pubkey() + }; + + // Select UTXOs from all wallets using coin selection algorithm + let selected_outpoints = self.select_utxos_with_algorithm( + amount_sats, + all_available_utxos, + fee_rate, + CoinSelectionAlgorithm::LargestFirst, // Use LargestFirst as default + &drain_script, + Some(channel_manager), + )?; + + // Prepare UTXO info for selected UTXOs using helper + let wallets_read = self.wallets.lock().unwrap(); + let utxo_infos = + self.prepare_outpoints_for_psbt(&selected_outpoints, &wallets_read)?; + drop(wallets_read); + + // Re-acquire mutable lock for building transaction + let mut wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + + // Build transaction with selected UTXOs from all wallets + let mut tx_builder = wallet.build_tx(); + let amount = Amount::from_sat(amount_sats); + tx_builder.add_recipient(address.script_pubkey(), amount).fee_rate(fee_rate); + + // Add selected UTXOs using helper + self.add_utxos_to_tx_builder(&mut tx_builder, 
&utxo_infos)?; + tx_builder.manually_selected_only(); + + // Finish transaction while lock is held + tx_builder.finish().map_err(|e| { + log_error!(self.logger, "Failed to create transaction: {}", e); + e + })? + }, + OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats } + if cur_anchor_reserve_sats > DUST_LIMIT_SATS => + { + let change_address_info = change_address_info_for_all.unwrap(); + // Calculate spendable amount from actual UTXOs that will be used, not from balance + // This ensures the displayed max matches what will actually be sent + let total_utxo_value = if let Some(ref all_utxos) = all_utxos_for_tmp { + all_utxos.iter().map(|utxo| utxo.txout.value.to_sat()).sum::() + } else { + 0 + }; + let tmp_tx = { + // Prepare UTXO info using helper + let wallets_read = self.wallets.lock().unwrap(); + let utxo_infos = if let Some(ref all_utxos) = all_utxos_for_tmp { + self.prepare_utxos_for_psbt(all_utxos, &wallets_read)? + } else { + Vec::new() + }; + drop(wallets_read); + + // Acquire mutable lock for building transaction + let mut wallets = self.wallets.lock().unwrap(); + let wallet = + wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + + let mut tmp_tx_builder = wallet.build_tx(); + // Add UTXOs using helper (ignoring errors for temp tx) + for info in &utxo_infos { + if info.is_primary { + if let Err(e) = tmp_tx_builder.add_utxo(info.outpoint) { + log_warn!( + self.logger, + "Failed to add UTXO {:?} to temp tx: {}", + info.outpoint, + e + ); + } + } else { + if let Err(e) = tmp_tx_builder.add_foreign_utxo( + info.outpoint, + info.psbt_input.clone(), + info.weight, + ) { + log_warn!( + self.logger, + "Failed to add foreign UTXO {:?} to temp tx: {}", + info.outpoint, + e + ); + } + } + } + tmp_tx_builder + .drain_to(address.script_pubkey()) + .add_recipient( + change_address_info.address.script_pubkey(), + Amount::from_sat(cur_anchor_reserve_sats), 
+ ) + .fee_rate(fee_rate); + + // Add manual UTXOs to temporary transaction if specified + if let Some(ref outpoints) = utxos_to_spend { + for outpoint in outpoints { + tmp_tx_builder.add_utxo(*outpoint).map_err(|e| { + log_error!( + self.logger, + "Failed to add UTXO {:?} to temp tx: {}", + outpoint, + e + ); + Error::OnchainTxCreationFailed + })?; + } + tmp_tx_builder.manually_selected_only(); + } + + match tmp_tx_builder.finish() { + Ok(psbt) => psbt.unsigned_tx, + Err(err) => { log_error!( self.logger, - "Failed to add UTXO {:?} to temp tx: {}", - outpoint, - e + "Failed to create temporary transaction: {}", + err ); - Error::OnchainTxCreationFailed - })?; + return Err(err.into()); + }, } - tmp_tx_builder.manually_selected_only(); - } + }; - match tmp_tx_builder.finish() { - Ok(psbt) => psbt.unsigned_tx, - Err(err) => { - log_error!( - self.logger, - "Failed to create temporary transaction: {}", - err - ); - return Err(err.into()); - }, - } - }; + // Get primary wallet for cancellation + // Calculate fee manually from transaction inputs and outputs + // This is necessary because tmp_tx may include foreign UTXOs + let estimated_tx_fee = { + let wallets_read = self.wallets.lock().unwrap(); + let mut total_input_value = 0u64; + for txin in &tmp_tx.input { + // Try to find the UTXO in any wallet + let mut found = false; + for wallet in wallets_read.values() { + if let Some(local_utxo) = wallet.get_utxo(txin.previous_output) { + total_input_value += local_utxo.txout.value.to_sat(); + found = true; + break; + } + } + if !found { + log_error!( + self.logger, + "Could not find TxOut for input {:?} in temporary transaction", + txin.previous_output + ); + return Err(Error::OnchainTxCreationFailed); + } + } + let total_output_value: u64 = + tmp_tx.output.iter().map(|txout| txout.value.to_sat()).sum(); + total_input_value.saturating_sub(total_output_value) + }; - let estimated_tx_fee = locked_wallet.calculate_fee(&tmp_tx).map_err(|e| { - log_error!( - self.logger, - 
"Failed to calculate fee of temporary transaction: {}", - e + // Cancel the transaction to free up any used change addresses + let mut wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + let estimated_tx_fee = Amount::from_sat(estimated_tx_fee); + + // 'cancel' the transaction to free up any used change addresses + wallet.cancel_tx(&tmp_tx); + + // Calculate spendable amount from actual UTXOs: total UTXO value - anchor reserve - fee + // This ensures the displayed max matches what will actually be sent + let estimated_spendable_amount = Amount::from_sat( + total_utxo_value + .saturating_sub(cur_anchor_reserve_sats) + .saturating_sub(estimated_tx_fee.to_sat()), ); - e - })?; - - // 'cancel' the transaction to free up any used change addresses - locked_wallet.cancel_tx(&tmp_tx); - let estimated_spendable_amount = Amount::from_sat( - spendable_amount_sats.saturating_sub(estimated_tx_fee.to_sat()), - ); - - if estimated_spendable_amount < Amount::from_sat(DUST_LIMIT_SATS) { - log_error!(self.logger, - "Unable to send payment without infringing on Anchor reserves. Available: {}sats, estimated fee required: {}sats.", - spendable_amount_sats, + if estimated_spendable_amount < Amount::from_sat(DUST_LIMIT_SATS) { + log_error!(self.logger, + "Unable to send payment without infringing on Anchor reserves. 
Total UTXO value: {}sats, anchor reserve: {}sats, estimated fee required: {}sats.", + total_utxo_value, + cur_anchor_reserve_sats, estimated_tx_fee, ); - return Err(Error::InsufficientFunds); - } - - let mut tx_builder = locked_wallet.build_tx(); - tx_builder - .add_recipient(address.script_pubkey(), estimated_spendable_amount) - .fee_absolute(estimated_tx_fee); - tx_builder - }, - OnchainSendAmount::AllDrainingReserve - | OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats: _ } => { - let mut tx_builder = locked_wallet.build_tx(); - tx_builder.drain_wallet().drain_to(address.script_pubkey()).fee_rate(fee_rate); - tx_builder - }, - }; + return Err(Error::InsufficientFunds); + } - // Add specified UTXOs if provided - if let Some(outpoints) = utxos_to_spend { - for outpoint in outpoints { - tx_builder.add_utxo(outpoint).map_err(|e| { - log_error!(self.logger, "Failed to add UTXO {:?}: {}", outpoint, e); - Error::OnchainTxCreationFailed - })?; - } + let mut tx_builder = wallet.build_tx(); + tx_builder + .add_recipient(address.script_pubkey(), estimated_spendable_amount) + .fee_absolute(estimated_tx_fee); - // Since UTXOs were specified, only use those - tx_builder.manually_selected_only(); - } + // Finish transaction while lock is held + tx_builder.finish().map_err(|e| { + log_error!(self.logger, "Failed to create transaction: {}", e); + e + })? + }, + OnchainSendAmount::AllDrainingReserve + | OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats: _ } => { + // Prepare UTXO info using helper + let wallets_read = self.wallets.lock().unwrap(); + let utxo_infos = if let Some(ref all_utxos) = all_utxos_for_drain { + self.prepare_utxos_for_psbt(all_utxos, &wallets_read)? 
+ } else { + Vec::new() + }; + drop(wallets_read); + + // Re-acquire mutable lock for building transaction + let mut wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or_else(|| { + log_error!(self.logger, "Primary wallet not found"); + Error::WalletOperationFailed + })?; + + // Build transaction with all UTXOs from all wallets + let mut tx_builder = wallet.build_tx(); + // Add UTXOs (ignoring errors for drain operations) + for info in &utxo_infos { + if info.is_primary { + if let Err(e) = tx_builder.add_utxo(info.outpoint) { + log_warn!( + self.logger, + "Failed to add UTXO {:?} from primary wallet: {}", + info.outpoint, + e + ); + } + } else { + if let Err(e) = tx_builder.add_foreign_utxo( + info.outpoint, + info.psbt_input.clone(), + info.weight, + ) { + log_warn!( + self.logger, + "Failed to add foreign UTXO {:?} from another wallet: {}", + info.outpoint, + e + ); + } + } + } + tx_builder.drain_to(address.script_pubkey()).fee_rate(fee_rate); - let psbt = match tx_builder.finish() { - Ok(psbt) => { - log_trace!(self.logger, "Created PSBT: {:?}", psbt); - psbt - }, - Err(err) => { - log_error!(self.logger, "Failed to create transaction: {}", err); - return Err(err.into()); - }, + // Finish transaction while lock is held + tx_builder.finish().map_err(|e| { + log_error!(self.logger, "Failed to create transaction: {}", e); + e + })? + }, + } }; // Check the reserve requirements (again) and return an error if they aren't met. 
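The `AllRetainingReserve` arithmetic in the hunk above boils down to two steps: the fee is recomputed as inputs-minus-outputs of the temporary transaction (necessary once foreign UTXOs are involved, since BDK's `calculate_fee` cannot see other wallets' inputs), and the spendable amount is total UTXO value minus the anchor reserve minus that fee, rejected if it falls below dust. A sketch of that arithmetic, with hypothetical helper names:

```rust
// Sketch of the AllRetainingReserve math: fee from input/output totals, then
// spendable = total - reserve - fee, checked against the dust limit. All
// saturating to avoid underflow panics on inconsistent inputs.

const DUST_LIMIT_SATS: u64 = 546;

/// Fee implied by a transaction's input and output values, in sats.
fn implied_fee_sats(input_values: &[u64], output_values: &[u64]) -> u64 {
    let total_in: u64 = input_values.iter().sum();
    let total_out: u64 = output_values.iter().sum();
    total_in.saturating_sub(total_out)
}

/// Amount drainable while retaining `reserve_sats`, or None if the result
/// would fall below the dust limit (i.e. the send must be rejected).
fn spendable_after_reserve(
    total_utxo_sats: u64, reserve_sats: u64, fee_sats: u64,
) -> Option<u64> {
    let spendable =
        total_utxo_sats.saturating_sub(reserve_sats).saturating_sub(fee_sats);
    if spendable < DUST_LIMIT_SATS {
        None
    } else {
        Some(spendable)
    }
}

fn main() {
    let fee = implied_fee_sats(&[80_000, 20_000], &[74_000, 25_000]); // 1_000
    println!("spendable: {:?}", spendable_after_reserve(100_000, 25_000, fee));
}
```

Computing spendable from actual UTXO values (rather than the cached balance) is what keeps the displayed maximum consistent with what the drain transaction will really send.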
match send_amount { OnchainSendAmount::ExactRetainingReserve { amount_sats, cur_anchor_reserve_sats } => { - let balance = locked_wallet.balance(); + // Get balance and fee using helpers + let (balance, tx_fee_sats) = { + let wallets_read = self.wallets.lock().unwrap(); + let balance = Self::get_aggregate_balance_from_wallets(&wallets_read); + let tx_fee_sats = self.calculate_fee_from_psbt(&psbt, &wallets_read)?; + (balance, tx_fee_sats) + }; let spendable_amount_sats = self .get_balances_inner(balance, cur_anchor_reserve_sats) .map(|(_, s)| s) .unwrap_or(0); - let tx_fee_sats = locked_wallet - .calculate_fee(&psbt.unsigned_tx) - .map_err(|e| { - log_error!( - self.logger, - "Failed to calculate fee of candidate transaction: {}", - e - ); - e - })? - .to_sat(); if spendable_amount_sats < amount_sats.saturating_add(tx_fee_sats) { log_error!(self.logger, "Unable to send payment due to insufficient funds. Available: {}sats, Required: {}sats + {}sats fee", @@ -1316,13 +2813,20 @@ impl Wallet { } }, OnchainSendAmount::AllRetainingReserve { cur_anchor_reserve_sats } => { - let balance = locked_wallet.balance(); + // Get balance from all wallets using helper + let balance = self.get_aggregate_balance(); let spendable_amount_sats = self .get_balances_inner(balance, cur_anchor_reserve_sats) .map(|(_, s)| s) .unwrap_or(0); - let (sent, received) = locked_wallet.sent_and_received(&psbt.unsigned_tx); - let drain_amount = sent - received; + // Get primary wallet for sent_and_received calculation + let wallets_read = self.wallets.lock().unwrap(); + let primary_wallet = wallets_read.get(&self.config.address_type).unwrap(); + let (sent, received) = primary_wallet.sent_and_received(&psbt.unsigned_tx); + drop(wallets_read); + // Use saturating_sub on satoshis to avoid panic if received > sent (shouldn't happen for drain, but be safe) + let drain_amount = + Amount::from_sat(sent.to_sat().saturating_sub(received.to_sat())); if spendable_amount_sats < drain_amount.to_sat() { 
log_error!(self.logger, "Unable to send payment due to insufficient funds. Available: {}sats, Required: {}", @@ -1335,7 +2839,8 @@ impl Wallet { _ => {}, } - Ok((psbt, locked_wallet)) + // Return just the PSBT - callers will need to lock wallets themselves when signing + Ok(psbt) } pub(crate) fn calculate_transaction_fee( @@ -1349,7 +2854,7 @@ impl Wallet { let fee_rate = fee_rate.unwrap_or_else(|| self.fee_estimator.estimate_fee_rate(confirmation_target)); - let (psbt, locked_wallet) = self.build_transaction_psbt( + let psbt = self.build_transaction_psbt( address, send_amount, fee_rate, @@ -1357,21 +2862,10 @@ impl Wallet { channel_manager, )?; - // Calculate the final fee - let calculated_fee = locked_wallet - .calculate_fee(&psbt.unsigned_tx) - .map_err(|e| { - log_error!(self.logger, "Failed to calculate fee of final transaction: {}", e); - e - })? - .to_sat(); - - log_info!( - self.logger, - "Calculated transaction fee: {}sats for sending to address {}", - calculated_fee, - address - ); + // Calculate the final fee using helper + let wallets_read = self.wallets.lock().unwrap(); + let calculated_fee = self.calculate_fee_from_psbt(&psbt, &wallets_read)?; + drop(wallets_read); Ok(calculated_fee) } @@ -1388,7 +2882,7 @@ impl Wallet { let fee_rate = fee_rate.unwrap_or_else(|| self.fee_estimator.estimate_fee_rate(confirmation_target)); - let (mut psbt, mut locked_wallet) = self.build_transaction_psbt( + let mut psbt = self.build_transaction_psbt( address, send_amount, fee_rate, @@ -1396,30 +2890,103 @@ impl Wallet { channel_manager, )?; - // Sign the transaction - match locked_wallet.sign(&mut psbt, SignOptions::default()) { - Ok(finalized) => { - if !finalized { - return Err(Error::OnchainTxCreationFailed); + // Sign the transaction - each wallet signs its own UTXOs + // This is necessary because each wallet has a different descriptor (Bip44, Bip49, Bip84, Bip86) + // and BDK's sign() method only signs inputs that match the wallet's descriptor + let mut wallets = 
self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + // Identify which wallet owns each input in the PSBT + let mut wallet_inputs: HashMap> = HashMap::new(); + let mut unsigned_inputs = Vec::new(); + for (i, txin) in psbt.unsigned_tx.input.iter().enumerate() { + // Find which wallet owns this UTXO + let mut found = false; + for (addr_type, wallet) in wallets.iter() { + if wallet.get_utxo(txin.previous_output).is_some() { + wallet_inputs.entry(*addr_type).or_insert_with(Vec::new).push(i); + found = true; + break; } - }, - Err(err) => { - log_error!(self.logger, "Failed to create transaction: {}", err); - return Err(err.into()); - }, + } + if !found { + unsigned_inputs.push(i); + log_warn!( + self.logger, + "Input {} (UTXO {:?}) not found in any wallet", + i, + txin.previous_output + ); + } } - // Persist the wallet - let mut locked_persister = self.persister.lock().unwrap(); - locked_wallet.persist(&mut locked_persister).map_err(|e| { - log_error!(self.logger, "Failed to persist wallet: {}", e); - Error::PersistenceFailed - })?; + // If we have inputs that aren't in any wallet, that's an error + if !unsigned_inputs.is_empty() { + log_error!( + self.logger, + "Some inputs are not owned by any wallet: {:?}", + unsigned_inputs + ); + return Err(Error::OnchainTxSigningFailed); + } + + // If no inputs found at all, that's also an error + if wallet_inputs.is_empty() { + log_error!(self.logger, "No inputs found in any wallet for transaction"); + return Err(Error::OnchainTxSigningFailed); + } + + // Sign with each wallet that has inputs in the transaction + for (addr_type, input_indices) in wallet_inputs { + let wallet = wallets.get_mut(&addr_type).ok_or_else(|| { + log_error!(self.logger, "Wallet not found for address type {:?}", addr_type); + Error::WalletOperationFailed + })?; + let persister = persisters.get_mut(&addr_type).ok_or_else(|| { + log_error!(self.logger, "Persister not found for address type {:?}", addr_type); + 
Error::WalletOperationFailed + })?; + + // Create sign options for this wallet + let mut sign_options = SignOptions::default(); + sign_options.trust_witness_utxo = true; + + // Sign inputs owned by this wallet + log_debug!( + self.logger, + "Attempting to sign {} inputs for address type {:?}", + input_indices.len(), + addr_type + ); + match wallet.sign(&mut psbt, sign_options) { + Ok(finalized) => { + // Note: finalized might be false if there are other unsigned inputs, which is expected + // We'll verify all inputs are signed when we call extract_tx() below + log_debug!(self.logger, "Signing completed for address type {:?} (finalized={}, expected {} inputs)", + addr_type, finalized, input_indices.len()); + }, + Err(err) => { + log_error!( + self.logger, + "Failed to sign inputs for address type {:?}: {}", + addr_type, + err + ); + return Err(Error::OnchainTxSigningFailed); + }, + } + + // Persist the wallet after signing + wallet.persist(persister).map_err(|e| { + log_error!(self.logger, "Failed to persist wallet for {:?}: {}", addr_type, e); + Error::PersistenceFailed + })?; + } // Extract the transaction + // Note: psbt.extract_tx() will fail if not all inputs are signed, which is what we want let tx = psbt.extract_tx().map_err(|e| { log_error!(self.logger, "Failed to extract transaction: {}", e); - e + Error::OnchainTxSigningFailed })?; self.broadcaster.broadcast_transactions(&[&tx]); @@ -1461,18 +3028,30 @@ impl Wallet { pub(crate) fn select_confirmed_utxos( &self, must_spend: Vec, must_pay_to: &[TxOut], fee_rate: FeeRate, ) -> Result, ()> { - let mut locked_wallet = self.inner.lock().unwrap(); - debug_assert!(matches!( - locked_wallet.public_descriptor(KeychainKind::External), - ExtendedDescriptor::Wpkh(_) - )); - debug_assert!(matches!( - locked_wallet.public_descriptor(KeychainKind::Internal), - ExtendedDescriptor::Wpkh(_) - )); - - let mut tx_builder = locked_wallet.build_tx(); - tx_builder.only_witness_utxo(); + // Note: Lightning requires the funding 
OUTPUT to be a witness script (P2WSH). + // Funding INPUTS must be spendable by LDK and are restricted to witness types (P2WPKH/P2TR). + let mut wallets = self.wallets.lock().unwrap(); + let funding_wallet_type = + if matches!(self.config.address_type, AddressType::NativeSegwit | AddressType::Taproot) + { + self.config.address_type + } else if wallets.contains_key(&AddressType::NativeSegwit) { + AddressType::NativeSegwit + } else if wallets.contains_key(&AddressType::Taproot) { + AddressType::Taproot + } else { + log_error!( + self.logger, + "No witness wallet available for channel funding. Primary: {:?}, Monitored: {:?}", + self.config.address_type, + self.config.address_types_to_monitor + ); + return Err(()); + }; + let wallet = wallets.get_mut(&funding_wallet_type).ok_or(())?; + + let mut tx_builder = wallet.build_tx(); + // Use witness wallet UTXOs to ensure compatibility with FundingTxInput constructors. for input in &must_spend { let psbt_input = psbt::Input { @@ -1490,7 +3069,8 @@ impl Wallet { tx_builder.fee_rate(fee_rate); tx_builder.exclude_unconfirmed(); - tx_builder + // Keep wallets locked for tx_details lookup + let result: Result, ()> = tx_builder .finish() .map_err(|e| { log_error!(self.logger, "Failed to select confirmed UTXOs: {}", e); @@ -1499,25 +3079,84 @@ impl Wallet { .input .iter() .filter(|txin| must_spend.iter().all(|input| input.outpoint != txin.previous_output)) - .filter_map(|txin| { - locked_wallet - .tx_details(txin.previous_output.txid) + .map(|txin| { + let prevtx = wallets + .values() + .find_map(|w| w.tx_details(txin.previous_output.txid)) .map(|tx_details| tx_details.tx.deref().clone()) - .map(|prevtx| FundingTxInput::new_p2wpkh(prevtx, txin.previous_output.vout)) + .ok_or_else(|| { + log_error!( + self.logger, + "Failed to find previous transaction for {:?}", + txin.previous_output + ); + })?; + + let vout = txin.previous_output.vout; + let script_pubkey = prevtx + .output + .get(vout as usize) + .map(|output| 
&output.script_pubkey) + .ok_or_else(|| { + log_error!( + self.logger, + "Missing output {} for previous transaction {:?}", + vout, + txin.previous_output.txid + ); + })?; + + if script_pubkey.is_p2wpkh() { + FundingTxInput::new_p2wpkh(prevtx, vout).map_err(|_| { + log_error!( + self.logger, + "Failed to create P2WPKH funding input for {:?}", + txin.previous_output + ); + }) + } else if script_pubkey.is_p2tr() { + FundingTxInput::new_p2tr_key_spend(prevtx, vout).map_err(|_| { + log_error!( + self.logger, + "Failed to create P2TR funding input for {:?}", + txin.previous_output + ); + }) + } else { + log_error!( + self.logger, + "Unsupported funding input script for {:?}: {:?}", + txin.previous_output, + script_pubkey + ); + Err(()) + } }) - .collect::, ()>>() + .collect(); + result } fn list_confirmed_utxos_inner(&self) -> Result, ()> { - let locked_wallet = self.inner.lock().unwrap(); + // Collect confirmed UTXOs from all wallets + let wallets = self.wallets.lock().unwrap(); let mut utxos = Vec::new(); - let confirmed_txs: Vec = locked_wallet - .transactions() - .filter(|t| t.chain_position.is_confirmed()) - .map(|t| t.tx_node.txid) - .collect(); + let mut all_confirmed_txs = Vec::new(); + let mut all_unspent_utxos = Vec::new(); + + for wallet in wallets.values() { + let confirmed_txs: Vec = wallet + .transactions() + .filter(|t| t.chain_position.is_confirmed()) + .map(|t| t.tx_node.txid) + .collect(); + all_confirmed_txs.extend(confirmed_txs); + all_unspent_utxos.extend(wallet.list_unspent()); + } + + let confirmed_txs_set: std::collections::HashSet<_> = + all_confirmed_txs.into_iter().collect(); let unspent_confirmed_utxos = - locked_wallet.list_unspent().filter(|u| confirmed_txs.contains(&u.outpoint.txid)); + all_unspent_utxos.into_iter().filter(|u| confirmed_txs_set.contains(&u.outpoint.txid)); for u in unspent_confirmed_utxos { let script_pubkey = u.txout.script_pubkey; @@ -1584,15 +3223,11 @@ impl Wallet { }; utxos.push(utxo); }, - Some(version) => { - 
log_error!(self.logger, "Unexpected witness version: {}", version,); + Some(_version) => { + // Unsupported witness version, skip }, None => { - log_error!( - self.logger, - "Tried to use a non-witness script. This must never happen." - ); - panic!("Tried to use a non-witness script. This must never happen."); + // Non-witness UTXO (Legacy), skip for Lightning operations }, } } @@ -1602,11 +3237,15 @@ impl Wallet { #[allow(deprecated)] fn get_change_script_inner(&self) -> Result { - let mut locked_wallet = self.inner.lock().unwrap(); - let mut locked_persister = self.persister.lock().unwrap(); + // Use primary wallet for change addresses + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); + + let wallet = wallets.get_mut(&self.config.address_type).ok_or(())?; + let persister = persisters.get_mut(&self.config.address_type).ok_or(())?; - let address_info = locked_wallet.next_unused_address(KeychainKind::Internal); - locked_wallet.persist(&mut locked_persister).map_err(|e| { + let address_info = wallet.next_unused_address(KeychainKind::Internal); + wallet.persist(persister).map_err(|e| { log_error!(self.logger, "Failed to persist wallet: {}", e); () })?; @@ -1615,29 +3254,62 @@ impl Wallet { #[allow(deprecated)] pub(crate) fn sign_owned_inputs(&self, unsigned_tx: Transaction) -> Result { - let locked_wallet = self.inner.lock().unwrap(); - let mut psbt = Psbt::from_unsigned_tx(unsigned_tx).map_err(|e| { log_error!(self.logger, "Failed to construct PSBT: {}", e); })?; - for (i, txin) in psbt.unsigned_tx.input.iter().enumerate() { - if let Some(utxo) = locked_wallet.get_utxo(txin.previous_output) { - debug_assert!(!utxo.is_spent); - psbt.inputs[i] = locked_wallet.get_psbt_input(utxo, None, true).map_err(|e| { - log_error!(self.logger, "Failed to construct PSBT input: {}", e); - })?; + + // Track which wallet owns each input so we can sign with the correct wallet + // Each wallet has a different descriptor (Bip44, 
Bip49, Bip84, Bip86) and can only sign its own inputs + let mut wallet_inputs: HashMap> = HashMap::new(); + + { + // First pass: populate PSBT inputs and track which wallet owns each + let wallets = self.wallets.lock().unwrap(); + for (i, txin) in psbt.unsigned_tx.input.iter().enumerate() { + let mut found = false; + for (addr_type, wallet) in wallets.iter() { + if let Some(utxo) = wallet.get_utxo(txin.previous_output) { + debug_assert!(!utxo.is_spent); + psbt.inputs[i] = wallet.get_psbt_input(utxo, None, true).map_err(|e| { + log_error!(self.logger, "Failed to construct PSBT input: {}", e); + })?; + wallet_inputs.entry(*addr_type).or_insert_with(Vec::new).push(i); + found = true; + break; + } + } + if !found { + log_error!( + self.logger, + "UTXO {:?} not found in any wallet", + txin.previous_output + ); + } } - } + } // Release read lock + // Second pass: sign with each wallet that owns inputs + let mut wallets = self.wallets.lock().unwrap(); let mut sign_options = SignOptions::default(); sign_options.trust_witness_utxo = true; - match locked_wallet.sign(&mut psbt, sign_options) { - Ok(finalized) => debug_assert!(!finalized), - Err(e) => { - log_error!(self.logger, "Failed to sign owned inputs: {}", e); - return Err(()); - }, + for (addr_type, _input_indices) in &wallet_inputs { + if let Some(wallet) = wallets.get_mut(addr_type) { + match wallet.sign(&mut psbt, sign_options.clone()) { + Ok(_finalized) => { + // finalized may be false if there are inputs from other wallets + }, + Err(e) => { + log_error!( + self.logger, + "Failed to sign owned inputs for wallet {:?}: {}", + addr_type, + e + ); + return Err(()); + }, + } + } } match psbt.extract_tx() { @@ -1652,7 +3324,9 @@ impl Wallet { #[allow(deprecated)] fn sign_psbt_inner(&self, mut psbt: Psbt) -> Result { - let locked_wallet = self.inner.lock().unwrap(); + // Use primary wallet for signing + let mut wallets = self.wallets.lock().unwrap(); + let wallet = wallets.get_mut(&self.config.address_type).ok_or(())?; 
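The ownership-tracking pattern used above for multi-wallet signing (in both the send path and `sign_owned_inputs`) is: attribute each PSBT input to the wallet whose UTXO set contains its previous outpoint, then sign per wallet, and fail fast if any input is unowned. A self-contained sketch with simplified stand-in types (a `u64` models an outpoint; the real code keys on BDK wallets and `OutPoint`s):

```rust
// Sketch of per-wallet input attribution: each input index is grouped under
// the address type whose wallet owns its previous outpoint; inputs owned by
// no wallet are returned as an error before signing is attempted.
use std::collections::{HashMap, HashSet};

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum AddressType {
    Legacy,
    NestedSegwit,
    NativeSegwit,
    Taproot,
}

/// Map each input index to the wallet that owns it; `Err` carries the indices
/// of inputs that no wallet can sign.
fn group_inputs_by_wallet(
    inputs: &[u64],                               // each input's previous outpoint
    wallets: &HashMap<AddressType, HashSet<u64>>, // address type -> owned outpoints
) -> Result<HashMap<AddressType, Vec<usize>>, Vec<usize>> {
    let mut grouped: HashMap<AddressType, Vec<usize>> = HashMap::new();
    let mut unowned = Vec::new();
    for (i, outpoint) in inputs.iter().enumerate() {
        match wallets.iter().find(|(_, owned)| owned.contains(outpoint)) {
            Some((addr_type, _)) => grouped.entry(*addr_type).or_default().push(i),
            None => unowned.push(i),
        }
    }
    if unowned.is_empty() {
        Ok(grouped)
    } else {
        Err(unowned)
    }
}

fn main() {
    let mut wallets = HashMap::new();
    wallets.insert(AddressType::NativeSegwit, HashSet::from([10u64, 11]));
    wallets.insert(AddressType::Taproot, HashSet::from([20u64]));
    println!("{:?}", group_inputs_by_wallet(&[10, 20, 11], &wallets));
}
```

The per-wallet grouping matters because each descriptor (BIP-44/49/84/86) can only satisfy its own inputs, so BDK's `sign()` legitimately returns `finalized = false` until every owning wallet has taken its pass; `extract_tx()` is the final check that all inputs were actually signed.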
// While BDK populates both `witness_utxo` and `non_witness_utxo` fields, LDK does not. As // BDK by default doesn't trust the witness UTXO to account for the Segwit bug, we must @@ -1660,7 +3334,7 @@ impl Wallet { let mut sign_options = SignOptions::default(); sign_options.trust_witness_utxo = true; - match locked_wallet.sign(&mut psbt, sign_options) { + match wallet.sign(&mut psbt, sign_options) { Ok(_finalized) => { // BDK will fail to finalize for all LDK-provided inputs of the PSBT. Unfortunately // we can't check more fine grained if it succeeded for all the other inputs here, @@ -1693,9 +3367,13 @@ impl Listen for Wallet { } fn block_connected(&self, block: &bitcoin::Block, height: u32) { - let mut locked_wallet = self.inner.lock().unwrap(); + // Apply block to all wallets + let mut wallets = self.wallets.lock().unwrap(); + let mut persisters = self.persisters.lock().unwrap(); - let pre_checkpoint = locked_wallet.latest_checkpoint(); + // Use primary wallet's checkpoint for reorg detection + let primary_wallet = wallets.get(&self.config.address_type).unwrap(); + let pre_checkpoint = primary_wallet.latest_checkpoint(); if pre_checkpoint.height() != height - 1 || pre_checkpoint.hash() != block.header.prev_blockhash { @@ -1707,31 +3385,53 @@ impl Listen for Wallet { ); } - match locked_wallet.apply_block(block, height) { - Ok(()) => { - if let Err(e) = self.update_payment_store(&mut *locked_wallet) { - log_error!(self.logger, "Failed to update payment store: {}", e); + // Apply block to all wallets + for (address_type, wallet) in wallets.iter_mut() { + match wallet.apply_block(block, height) { + Ok(()) => { + if let Some(persister) = persisters.get_mut(address_type) { + if let Err(e) = wallet.persist(persister) { + log_error!( + self.logger, + "Failed to persist wallet {:?}: {}", + address_type, + e + ); + return; + } + } + }, + Err(e) => { + log_error!( + self.logger, + "Failed to apply connected block to wallet {:?}: {}", + address_type, + e + ); return; - } 
- }, - Err(e) => { + }, + }; + } + + let mut all_txids = std::collections::HashSet::new(); + for wallet in wallets.values() { + for wtx in wallet.transactions() { + all_txids.insert(wtx.tx_node.txid); + } + } + drop(wallets); + drop(persisters); + + if !all_txids.is_empty() { + let txids_vec: Vec<Txid> = all_txids.into_iter().collect(); + if let Err(e) = self.update_payment_store_for_txids(txids_vec) { log_error!( self.logger, - "Failed to apply connected block to on-chain wallet: {}", + "Failed to update payment store after block connected: {}", e ); - return; - }, - }; - - let mut locked_persister = self.persister.lock().unwrap(); - match locked_wallet.persist(&mut locked_persister) { - Ok(_) => (), - Err(e) => { - log_error!(self.logger, "Failed to persist on-chain wallet: {}", e); - return; - }, - }; + } + } } fn blocks_disconnected(&self, _fork_point_block: BestBlock) { @@ -1874,15 +3574,17 @@ impl SignerProvider for WalletKeysManager { } fn get_destination_script(&self, _channel_keys_id: [u8; 32]) -> Result<ScriptBuf, ()> { - let address = self.wallet.get_new_address().map_err(|e| { - log_error!(self.logger, "Failed to retrieve new address from wallet: {}", e); + // Lightning channels require witness addresses, so use get_new_witness_address + let address = self.wallet.get_new_witness_address().map_err(|e| { + log_error!(self.logger, "Failed to retrieve new witness address from wallet: {}", e); })?; Ok(address.script_pubkey()) } fn get_shutdown_scriptpubkey(&self) -> Result<ShutdownScript, ()> { - let address = self.wallet.get_new_address().map_err(|e| { - log_error!(self.logger, "Failed to retrieve new address from wallet: {}", e); + // Lightning channels require witness addresses, so use get_new_witness_address + let address = self.wallet.get_new_witness_address().map_err(|e| { + log_error!(self.logger, "Failed to retrieve new witness address from wallet: {}", e); })?; match address.witness_program() { @@ -1892,7 +3594,8 @@ impl SignerProvider for WalletKeysManager { _ => { log_error!( 
self.logger, - "Tried to use a non-witness address. This must never happen." + "Tried to use a non-witness address. This must never happen. Address: {}", + address ); panic!("Tried to use a non-witness address. This must never happen."); }, @@ -1907,10 +3610,15 @@ impl ChangeDestinationSource for WalletKeysManager { let wallet = Arc::clone(&self.wallet); let logger = Arc::clone(&self.logger); Box::pin(async move { + // Lightning channels require witness addresses for change outputs wallet - .get_new_internal_address() + .get_new_witness_internal_address() .map_err(|e| { - log_error!(logger, "Failed to retrieve new address from wallet: {}", e); + log_error!( + logger, + "Failed to retrieve new witness internal address from wallet: {}", + e + ); }) .map(|addr| addr.script_pubkey()) .map_err(|_| ()) diff --git a/src/wallet/persist.rs b/src/wallet/persist.rs index 5c8668937..e07fa59cc 100644 --- a/src/wallet/persist.rs +++ b/src/wallet/persist.rs @@ -10,6 +10,7 @@ use std::sync::Arc; use bdk_chain::Merge; use bdk_wallet::{ChangeSet, WalletPersister}; +use crate::config::AddressType; use crate::io::utils::{ read_bdk_wallet_change_set, write_bdk_wallet_change_descriptor, write_bdk_wallet_descriptor, write_bdk_wallet_indexer, write_bdk_wallet_local_chain, write_bdk_wallet_network, @@ -21,11 +22,14 @@ pub(crate) struct KVStoreWalletPersister { latest_change_set: Option<ChangeSet>, kv_store: Arc<DynStore>, logger: Arc<Logger>, + address_type: AddressType, } impl KVStoreWalletPersister { - pub(crate) fn new(kv_store: Arc<DynStore>, logger: Arc<Logger>) -> Self { - Self { latest_change_set: None, kv_store, logger } + pub(crate) fn new( + kv_store: Arc<DynStore>, logger: Arc<Logger>, address_type: AddressType, + ) -> Self { + Self { latest_change_set: None, kv_store, logger, address_type } + } } @@ -41,6 +45,7 @@ impl WalletPersister for KVStoreWalletPersister { let change_set_opt = read_bdk_wallet_change_set( Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; let change_set = match change_set_opt { 
@@ -91,6 +96,7 @@ impl WalletPersister for KVStoreWalletPersister { &descriptor, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } } @@ -114,6 +120,7 @@ impl WalletPersister for KVStoreWalletPersister { &change_descriptor, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } } @@ -135,6 +142,7 @@ impl WalletPersister for KVStoreWalletPersister { &network, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } } @@ -159,6 +167,7 @@ impl WalletPersister for KVStoreWalletPersister { &latest_change_set.indexer, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } @@ -168,6 +177,7 @@ impl WalletPersister for KVStoreWalletPersister { &latest_change_set.tx_graph, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } @@ -177,6 +187,7 @@ impl WalletPersister for KVStoreWalletPersister { &latest_change_set.local_chain, Arc::clone(&persister.kv_store), Arc::clone(&persister.logger), + persister.address_type, )?; } diff --git a/tests/multi_wallet_tests.rs b/tests/multi_wallet_tests.rs new file mode 100644 index 000000000..b4fef8076 --- /dev/null +++ b/tests/multi_wallet_tests.rs @@ -0,0 +1,1882 @@ +// This file is Copyright its original authors, visible in version control history. +// +// This file is licensed under the Apache License, Version 2.0 <LICENSE-APACHE or +// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license <LICENSE-MIT or +// http://opensource.org/licenses/MIT>, at your option. You may not use this file except in +// accordance with one or both of these licenses. 
+ +mod common; + +use std::str::FromStr; + +use bitcoin::{Address, FeeRate}; +use common::{ + generate_blocks_and_wait, premine_and_distribute_funds, setup_bitcoind_and_electrsd, + setup_node, wait_for_tx, TestChainSource, +}; +use ldk_node::bitcoin::Amount; +use ldk_node::config::AddressType; + +// Test that node can be set up with multiple wallets configured +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_setup() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + // Test with NativeSegwit as primary, monitoring one other type + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Verify node is running + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); +} + +// Test that all address types can be used as primary +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_all_address_types_as_primary() { + let address_types = vec![ + AddressType::Legacy, + AddressType::NestedSegwit, + AddressType::NativeSegwit, + AddressType::Taproot, + ]; + + for primary_type in address_types { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = primary_type; + config.node_config.address_types_to_monitor = vec![]; + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); + + node.stop().unwrap(); + } +} + +// 
Test that we can generate addresses for monitored address types +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_new_address_for_type() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NestedSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + let address_types = vec![ + AddressType::Legacy, + AddressType::NestedSegwit, + AddressType::NativeSegwit, + AddressType::Taproot, + ]; + + for address_type in address_types { + let addr = node.onchain_payment().new_address_for_type(address_type).unwrap(); + assert!(!addr.to_string().is_empty()); + } +} + +// Test that multiple address types can be configured for monitoring +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_monitoring_config() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + // Test with different primary types and monitoring configurations + let test_cases = vec![ + (AddressType::Legacy, vec![AddressType::NativeSegwit, AddressType::Taproot]), + (AddressType::NestedSegwit, vec![AddressType::Legacy, AddressType::NativeSegwit]), + ( + AddressType::NativeSegwit, + vec![AddressType::Legacy, AddressType::NestedSegwit, AddressType::Taproot], + ), + (AddressType::Taproot, vec![AddressType::Legacy, AddressType::NestedSegwit]), + ]; + + for (primary_type, monitored_types) in test_cases { + let mut config = common::random_config(true); + config.node_config.address_type = primary_type; + config.node_config.address_types_to_monitor = monitored_types.clone(); + + let node = setup_node(&chain_source, config, None); + 
assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); + + node.stop().unwrap(); + } +} + +// Test that Electrum chain source works with multi-wallet configuration +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_electrum_setup() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Electrum(&_electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NestedSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::NativeSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); +} + +// Test that all combinations of primary and monitored types work +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_all_combinations() { + let address_types = vec![ + AddressType::Legacy, + AddressType::NestedSegwit, + AddressType::NativeSegwit, + AddressType::Taproot, + ]; + + for primary_type in &address_types { + for monitored_type in &address_types { + if primary_type == monitored_type { + continue; // Skip same type + } + + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = *primary_type; + config.node_config.address_types_to_monitor = vec![*monitored_type]; + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); + + node.stop().unwrap(); 
+ } + } +} + +// Test that empty monitoring list works (only primary wallet) +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_empty_monitoring() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![]; // Empty monitoring list + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); +} + +// Test that monitoring all other types works +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_monitor_all_others() { + let (_bitcoind, _electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&_electrsd); + + let address_types = vec![ + AddressType::Legacy, + AddressType::NestedSegwit, + AddressType::NativeSegwit, + AddressType::Taproot, + ]; + + for primary_type in &address_types { + let mut config = common::random_config(true); + config.node_config.address_type = *primary_type; + // Monitor all other types + config.node_config.address_types_to_monitor = + address_types.iter().copied().filter(|&at| at != *primary_type).collect(); + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running); + + // Verify we can generate an address + let addr = node.onchain_payment().new_address().unwrap(); + assert!(!addr.to_string().is_empty()); + + node.stop().unwrap(); + } +} + +// Test send operation with multi-wallet (should use UTXOs from all wallets) +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_send_operation() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = 
TestChainSource::Esplora(&electrsd); + + // Test with NativeSegwit as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NestedSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 200_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Generate blocks to confirm the transaction + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + + // Sync wallets to detect the funds + node.sync_wallets().unwrap(); + + // Wait a bit for the chain source to index + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify we have funds + let balances = node.list_balances(); + assert!( + balances.spendable_onchain_balance_sats >= fund_amount - 10_000, + "Should have funds available (accounting for fees)" + ); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Test send operation - should work with UTXOs from primary wallet + let send_amount = 50_000; + let txid = + node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None).unwrap(); + + // Wait for transaction to appear in mempool and verify + wait_for_tx(&electrsd.client, txid).await; + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Verify balance decreased appropriately + let new_balances = node.list_balances(); + assert!( + new_balances.spendable_onchain_balance_sats < balances.spendable_onchain_balance_sats, + "Balance should decrease after 
send" + ); +} + +// Test send_all_to_address with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_send_all() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with Taproot as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::Taproot; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NativeSegwit]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 300_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Generate blocks to confirm + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Wait a bit for the chain source to index + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify we have funds + let balances = node.list_balances(); + assert!( + balances.spendable_onchain_balance_sats >= fund_amount - 10_000, + "Should have funds available" + ); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Test send_all_to_address - should use UTXOs from all wallets + let txid = node.onchain_payment().send_all_to_address(&recipient_addr, true, None).unwrap(); + + // Wait for transaction and verify + wait_for_tx(&electrsd.client, txid).await; + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Verify balance is near zero after sending all + let new_balances = node.list_balances(); + assert!( + new_balances.spendable_onchain_balance_sats 
< 10_000, + "Balance should be near zero after sending all funds" + ); +} + +// Test RBF operation with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_rbf() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with Legacy as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::Legacy; + config.node_config.address_types_to_monitor = + vec![AddressType::NativeSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 250_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Wait for funding to be confirmed and synced + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send a transaction (this should use UTXOs from all wallets) + let send_amount = 50_000; + let initial_txid = + node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None).unwrap(); + + // Wait for the transaction to be in mempool and sync wallet + wait_for_tx(&electrsd.client, initial_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Test RBF - bump the fee (should be able to use UTXOs from all wallets for the replacement) + let higher_fee_rate = FeeRate::from_sat_per_kwu(1000); // Higher fee rate + let rbf_txid = node.onchain_payment().bump_fee_by_rbf(&initial_txid, 
higher_fee_rate).unwrap(); + + // Verify we got a new transaction ID + assert_ne!(initial_txid, rbf_txid, "RBF should create a new transaction"); +} + +// Test CPFP operation with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_cpfp() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with NestedSegwit as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NestedSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NativeSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 400_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Wait for funding to be confirmed and synced + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send a transaction + let send_amount = 60_000; + let parent_txid = + node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None).unwrap(); + + // Wait for the transaction to be in mempool and sync wallet + wait_for_tx(&electrsd.client, parent_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Test CPFP - accelerate the parent transaction (should use UTXOs from all wallets) + let cpfp_fee_rate = FeeRate::from_sat_per_kwu(1500); + let cpfp_txid = + 
node.onchain_payment().accelerate_by_cpfp(&parent_txid, Some(cpfp_fee_rate), None).unwrap(); + + // Verify we got a new transaction ID + assert_ne!(parent_txid, cpfp_txid, "CPFP should create a new child transaction"); + + // Test calculate_cpfp_fee_rate + let calculated_fee_rate = + node.onchain_payment().calculate_cpfp_fee_rate(&parent_txid, false).unwrap(); + assert!(calculated_fee_rate.to_sat_per_kwu() > 0, "CPFP fee rate should be calculated"); +} + +// Test CPFP works correctly for cross-wallet transactions +// The change from cross-wallet transactions goes to the primary wallet, +// and CPFP should find it there even when the transaction used inputs from multiple wallets +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_cpfp_for_cross_wallet_transaction() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from both wallet types + let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + + // Fund both wallets with amounts that require combining for a larger send + let fund_amount_each = 100_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + + // Generate blocks and sync + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + 
std::thread::sleep(std::time::Duration::from_secs(2)); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send amount requiring UTXOs from BOTH wallets + // 140,000 > 100,000 (single wallet) but < 200,000 (both wallets) + // Leave enough for change to enable CPFP + let send_amount = 140_000; + let parent_txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Cross-wallet send should succeed"); + + // Wait for tx to be in mempool + wait_for_tx(&electrsd.client, parent_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // CPFP should work for cross-wallet transactions because change goes to primary wallet + let cpfp_fee_rate = FeeRate::from_sat_per_kwu(1500); + let cpfp_result = + node.onchain_payment().accelerate_by_cpfp(&parent_txid, Some(cpfp_fee_rate), None); + + // CPFP should succeed - the change output is in the primary wallet + assert!( + cpfp_result.is_ok(), + "CPFP should work for cross-wallet transactions (change goes to primary wallet): {:?}", + cpfp_result.err() + ); + + let cpfp_txid = cpfp_result.unwrap(); + assert_ne!(parent_txid, cpfp_txid, "CPFP should create a new child transaction"); + + // Wait for child tx + wait_for_tx(&electrsd.client, cpfp_txid).await; + + node.stop().unwrap(); +} + +// Test UTXO selection with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_utxo_selection() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with NativeSegwit as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, 
AddressType::NestedSegwit]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 500_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Wait for funding to be confirmed and synced + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Test list_spendable_outputs - should return UTXOs from all wallets + let spendable_outputs = node.onchain_payment().list_spendable_outputs().unwrap(); + assert!(!spendable_outputs.is_empty(), "Should have spendable outputs from primary wallet"); + + // Test select_utxos_with_algorithm - should consider UTXOs from all wallets + let target_amount = 100_000; + let fee_rate = FeeRate::from_sat_per_kwu(500); + let selected_utxos = node + .onchain_payment() + .select_utxos_with_algorithm( + target_amount, + Some(fee_rate), + ldk_node::CoinSelectionAlgorithm::LargestFirst, + None, + ) + .unwrap(); + + // Should have selected at least one UTXO from the funded wallet + assert!(!selected_utxos.is_empty(), "Should have selected at least one UTXO"); +} + +// Test balance aggregation from all wallets +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_balance_aggregation() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with NativeSegwit as primary, monitoring other types + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NestedSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = 
node.onchain_payment().new_address().unwrap(); + let fund_amount = 150_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Test that list_balances aggregates from all wallets + // The method should succeed without error - balance values are valid (may be 0 if sync hasn't completed) + let balances = node.list_balances(); + // Just verify we can access the fields - they're u64 so always valid + let _ = balances.spendable_onchain_balance_sats; + let _ = balances.total_onchain_balance_sats; +} + +// Test get_address_balance with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_get_address_balance() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 120_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Test get_address_balance - should work for addresses from any wallet + // The unwrap verifies the method succeeds; balance may be 0 if sync hasn't completed + let _balance = node.get_address_balance(&addr.to_string()).unwrap(); +} + +// Test calculate_total_fee with multi-wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_calculate_total_fee() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = 
AddressType::Taproot; + config.node_config.address_types_to_monitor = vec![AddressType::NativeSegwit]; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 300_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Wait for funding to be confirmed and synced + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Test calculate_total_fee - should consider UTXOs from all wallets + let send_amount = 100_000; + let fee_rate = FeeRate::from_sat_per_kwu(500); + let total_fee = node + .onchain_payment() + .calculate_total_fee(&recipient_addr, send_amount, Some(fee_rate), None) + .unwrap(); + + assert!(total_fee > 0, "Total fee should be calculated"); +} + +// Test send operation with UTXOs from multiple wallets (different address types) +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_send_with_utxos_from_all_wallets() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy and Taproot + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses for each wallet type by creating a new address (primary) and + // accessing monitored wallets through the node's internal state + 
// For this test, we'll fund the primary address and verify send works + let primary_addr = node.onchain_payment().new_address().unwrap(); + + // Fund the primary address + let fund_amount = 500_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![primary_addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Generate blocks to confirm the transaction + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + + // Sync wallets to detect the funds + node.sync_wallets().unwrap(); + + // Wait a bit for the chain source to index + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify we have funds + let balances = node.list_balances(); + assert!( + balances.spendable_onchain_balance_sats >= fund_amount - 10_000, + "Should have funds available (accounting for fees)" + ); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Test send operation - should work with UTXOs from primary wallet + let send_amount = 100_000; + let txid = + node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None).unwrap(); + + // Verify transaction was created (if send_to_address succeeded, txid is valid) + // The fact that we got here without an error means the transaction was created successfully + + // Wait for transaction to be confirmed + wait_for_tx(&electrsd.client, txid).await; + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Wait a bit for the chain source to index + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify balance decreased + let new_balances = node.list_balances(); + assert!( + new_balances.spendable_onchain_balance_sats < balances.spendable_onchain_balance_sats, + "Balance should decrease after send" + ); +} + +// Test send_all_to_address with 
multi-wallet (should use UTXOs from all wallets)
+#[tokio::test(flavor = "multi_thread", worker_threads = 1)]
+async fn test_multi_wallet_send_all_with_utxos_from_all_wallets() {
+ let (bitcoind, electrsd) = setup_bitcoind_and_electrsd();
+ let chain_source = TestChainSource::Esplora(&electrsd);
+
+ // Set up node with Legacy as primary, monitoring NativeSegwit and NestedSegwit
+ let mut config = common::random_config(true);
+ config.node_config.address_type = AddressType::Legacy;
+ config.node_config.address_types_to_monitor =
+ vec![AddressType::NativeSegwit, AddressType::NestedSegwit];
+
+ let node = setup_node(&chain_source, config, None);
+
+ // Fund the primary address
+ let primary_addr = node.onchain_payment().new_address().unwrap();
+ let fund_amount = 400_000;
+ premine_and_distribute_funds(
+ &bitcoind.client,
+ &electrsd.client,
+ vec![primary_addr.clone()],
+ Amount::from_sat(fund_amount),
+ )
+ .await;
+
+ // Generate blocks to confirm
+ generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await;
+ node.sync_wallets().unwrap();
+
+ // Wait a bit for the chain source to index
+ std::thread::sleep(std::time::Duration::from_millis(500));
+
+ // Verify we have funds
+ let balances = node.list_balances();
+ assert!(
+ balances.spendable_onchain_balance_sats >= fund_amount - 10_000,
+ "Should have funds available"
+ );
+
+ // Create a recipient address
+ let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080")
+ .unwrap()
+ .require_network(bitcoin::Network::Regtest)
+ .unwrap();
+
+ // Test send_all_to_address - should use UTXOs from all wallets
+ let txid = node.onchain_payment().send_all_to_address(&recipient_addr, true, None).unwrap();
+
+ // If send_all_to_address returned Ok, the transaction was built and broadcast
+ // and the returned txid is valid
+
+ // Wait for transaction to be confirmed
+ wait_for_tx(&electrsd.client, txid).await;
+
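At this point the test has asked `send_all_to_address` to drain every monitored wallet. The expected arithmetic can be sketched as follows; the helper name is hypothetical, not part of the fork's API:

```rust
// Hypothetical helper, not the fork's API: what "send all" across multiple
// wallets should amount to numerically — the sum of each wallet's spendable
// balance, minus the transaction fee.
fn drain_amount_sats(spendable_per_wallet: &[u64], fee_sats: u64) -> Option<u64> {
    let total: u64 = spendable_per_wallet.iter().sum();
    // None means the fee exceeds the combined spendable balance.
    total.checked_sub(fee_sats)
}
```

This is why the assertions below only check that the remaining balance is "near zero": the fee (and any reserve) is subtracted from the combined total rather than from a single wallet.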
generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Wait a bit for the chain source to index + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify balance is near zero (reserve may remain) + let new_balances = node.list_balances(); + assert!( + new_balances.spendable_onchain_balance_sats < 10_000, + "Balance should be near zero after sending all funds" + ); +} + +// Test that send operation correctly handles UTXOs from different address types +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_send_handles_different_address_types() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Test with different primary types to ensure all combinations work + let test_cases = vec![ + (AddressType::NativeSegwit, vec![AddressType::Legacy]), + (AddressType::Legacy, vec![AddressType::NativeSegwit, AddressType::Taproot]), + (AddressType::Taproot, vec![AddressType::NestedSegwit]), + ]; + + for (primary_type, monitored_types) in test_cases { + let mut config = common::random_config(true); + config.node_config.address_type = primary_type; + config.node_config.address_types_to_monitor = monitored_types; + + let node = setup_node(&chain_source, config, None); + + // Fund the primary address + let addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 200_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Test send - should 
work regardless of address type combination + let send_amount = 50_000; + let txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Send should succeed for all address type combinations"); + + // Wait for transaction to propagate + wait_for_tx(&electrsd.client, txid).await; + + node.stop().unwrap(); + } +} + +// Test spending UTXOs from multiple wallets (different address types) in a single transaction +// This is the key test for multi-wallet functionality - it verifies that UTXOs from different +// address types can be combined in a single transaction +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_cross_wallet_spending() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from both wallet types + let native_segwit_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + + // Fund both addresses with amounts that individually aren't enough for a larger send + // NativeSegwit: 100,000 sats + // Legacy: 100,000 sats + // Total: 200,000 sats + let fund_amount_each = 100_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_segwit_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + + // Generate blocks to confirm the transactions + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; 
+ + // Sync wallets to detect the funds + node.sync_wallets().unwrap(); + + // Wait for sync to complete + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify we have funds in both wallets by checking total balance + let balances = node.list_balances(); + let expected_total = fund_amount_each * 2; + assert!( + balances.total_onchain_balance_sats >= expected_total - 10_000, + "Should have ~{} sats total, but have {} sats", + expected_total, + balances.total_onchain_balance_sats + ); + + // Check per-address-type balances + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + + assert!( + native_balance.total_sats >= fund_amount_each - 1000, + "NativeSegwit wallet should have ~{} sats, but has {} sats", + fund_amount_each, + native_balance.total_sats + ); + assert!( + legacy_balance.total_sats >= fund_amount_each - 1000, + "Legacy wallet should have ~{} sats, but has {} sats", + fund_amount_each, + legacy_balance.total_sats + ); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Try to send an amount that requires UTXOs from BOTH wallets + // 150,000 sats > 100,000 sats (either wallet alone) but < 200,000 sats (both combined) + let send_amount = 150_000; + let txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Cross-wallet spending should succeed - UTXOs from both wallets should be used"); + + // Wait for transaction to propagate + wait_for_tx(&electrsd.client, txid).await; + + // Generate a block to confirm the transaction + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + + // Verify balance decreased appropriately + let new_balances = node.list_balances(); + 
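The cross-wallet spend above (150,000 sats against two 100,000-sat wallets) only succeeds if coin selection can pool UTXOs across address types. A minimal sketch under stated assumptions — the enum mirrors the changelog's `AddressType`, while the largest-first strategy and the function itself are illustrative, not the fork's real selector:

```rust
// Mirrors the changelog's `AddressType`; unused variants are kept since
// this is only a sketch.
#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq)]
enum AddressType {
    Legacy,
    NestedSegwit,
    NativeSegwit,
    Taproot,
}

/// Pool UTXO values (sats) from every wallet and pick largest-first until
/// `target_sats + fee_sats` is covered; `None` means even the combined
/// wallets cannot fund the send.
fn select_across_wallets(
    wallets: &[(AddressType, Vec<u64>)], target_sats: u64, fee_sats: u64,
) -> Option<Vec<(AddressType, u64)>> {
    let mut pool: Vec<(AddressType, u64)> =
        wallets.iter().flat_map(|(ty, vals)| vals.iter().map(move |v| (*ty, *v))).collect();
    // Largest-first keeps the input count low; real selection is smarter.
    pool.sort_by(|a, b| b.1.cmp(&a.1));

    let mut selected = Vec::new();
    let mut total = 0u64;
    for (ty, value) in pool {
        if total >= target_sats + fee_sats {
            break;
        }
        total += value;
        selected.push((ty, value));
    }
    (total >= target_sats + fee_sats).then_some(selected)
}
```

With the funding pattern used in this test, neither wallet alone covers the target, so any correct selection must return inputs of both address types — which is exactly what the `send_to_address` call below exercises.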
assert!( + new_balances.total_onchain_balance_sats + < balances.total_onchain_balance_sats - send_amount + 10_000, + "Balance should have decreased by at least the send amount" + ); + + // Verify list_monitored_address_types returns both types + let monitored_types = node.list_monitored_address_types(); + assert!(monitored_types.contains(&AddressType::NativeSegwit), "Should monitor NativeSegwit"); + assert!(monitored_types.contains(&AddressType::Legacy), "Should monitor Legacy"); +} + +// Test that get_balance_for_address_type returns an error for unmonitored types +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_get_balance_for_unmonitored_type() { + let (_bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Setup node with only NativeSegwit as primary, no additional monitoring + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![]; + + let node = setup_node(&chain_source, config, None); + + // Querying the primary type should succeed + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit); + assert!(native_balance.is_ok(), "Should be able to get balance for primary type"); + + // Querying an unmonitored type should return an error + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy); + assert!(legacy_balance.is_err(), "Should return error for unmonitored address type"); + + let taproot_balance = node.get_balance_for_address_type(AddressType::Taproot); + assert!(taproot_balance.is_err(), "Should return error for unmonitored address type"); + + node.stop().unwrap(); +} + +// Test that address_types_to_monitor handles deduplication correctly +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_address_type_deduplication() { + let (_bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let 
chain_source = TestChainSource::Esplora(&electrsd); + + // Setup node with NativeSegwit as primary, but also include it in monitor list + // This tests that the system handles deduplication correctly + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::NativeSegwit, AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + assert!(node.status().is_running, "Node should start without error"); + + // list_monitored_address_types should return exactly 2 types (not 3) + // NativeSegwit (primary) + Legacy (monitored), with NativeSegwit deduplicated + let monitored_types = node.list_monitored_address_types(); + assert_eq!( + monitored_types.len(), + 2, + "Should have exactly 2 monitored types (deduplicated), got {:?}", + monitored_types + ); + assert!(monitored_types.contains(&AddressType::NativeSegwit), "Should contain NativeSegwit"); + assert!(monitored_types.contains(&AddressType::Legacy), "Should contain Legacy"); + + node.stop().unwrap(); +} + +// Test spending works correctly when a monitored wallet has 0 balance +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_spend_with_empty_monitored_wallet() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Setup node monitoring Legacy, but only fund NativeSegwit + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Only fund the NativeSegwit address (primary) + let native_addr = node.onchain_payment().new_address().unwrap(); + let fund_amount = 200_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount), + 
) + .await; + + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify NativeSegwit has funds and Legacy has 0 + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + + assert!( + native_balance.total_sats >= fund_amount - 1000, + "NativeSegwit should have funds: {}", + native_balance.total_sats + ); + assert_eq!(legacy_balance.total_sats, 0, "Legacy wallet should have 0 balance"); + + // Send should work from NativeSegwit only + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + let send_amount = 50_000; + let result = node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None); + assert!(result.is_ok(), "Send should succeed using only NativeSegwit funds"); + + node.stop().unwrap(); +} + +// Test that multi-wallet state persists correctly across node restarts +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_multi_wallet_persistence_across_restart() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Create config with multi-wallet setup + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from multiple wallet types + let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + let taproot_addr = 
node.onchain_payment().new_address_for_type(AddressType::Taproot).unwrap(); + + // Fund all three wallets + let fund_amount = 100_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![taproot_addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Generate blocks and sync + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify balances before restart + let native_balance_before = + node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance_before = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + let taproot_balance_before = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + let total_balance_before = node.list_balances().total_onchain_balance_sats; + + assert!( + native_balance_before.total_sats >= fund_amount - 1000, + "NativeSegwit should have funds before restart" + ); + assert!( + legacy_balance_before.total_sats >= fund_amount - 1000, + "Legacy should have funds before restart" + ); + assert!( + taproot_balance_before.total_sats >= fund_amount - 1000, + "Taproot should have funds before restart" + ); + + // Stop the node + node.stop().unwrap(); + + // Restart the node using stop/start pattern (same node object) + node.start().unwrap(); + + // Sync after restart + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify balances are preserved after restart + let native_balance_after = + node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance_after = 
node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + let taproot_balance_after = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + let total_balance_after = node.list_balances().total_onchain_balance_sats; + + assert_eq!( + native_balance_before.total_sats, native_balance_after.total_sats, + "NativeSegwit balance should persist across restart" + ); + assert_eq!( + legacy_balance_before.total_sats, legacy_balance_after.total_sats, + "Legacy balance should persist across restart" + ); + assert_eq!( + taproot_balance_before.total_sats, taproot_balance_after.total_sats, + "Taproot balance should persist across restart" + ); + assert_eq!( + total_balance_before, total_balance_after, + "Total balance should persist across restart" + ); + + // Verify list_monitored_address_types returns all types after restart + let monitored_types = node.list_monitored_address_types(); + assert!( + monitored_types.contains(&AddressType::NativeSegwit), + "Should still monitor NativeSegwit" + ); + assert!(monitored_types.contains(&AddressType::Legacy), "Should still monitor Legacy"); + assert!(monitored_types.contains(&AddressType::Taproot), "Should still monitor Taproot"); + + node.stop().unwrap(); +} + +// Test cross-wallet spending with three different wallet types +// This ensures UTXOs from multiple different address types can be combined +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_cross_wallet_spending_three_types() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy and Taproot + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from all three wallet types + 
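The persistence assertions above compare per-type balances against the node-wide totals. The relationship can be sketched as simple sums; the field names follow the changelog's `AddressTypeBalance`, but the struct layout and helper functions here are assumptions, not the fork's real code:

```rust
// Assumed layout of the per-wallet balance returned by
// `get_balance_for_address_type` (the real struct may differ).
struct AddressTypeBalance {
    total_sats: u64,
    spendable_sats: u64,
}

// Node-wide figures, as `list_balances` reports them, are sums over all
// monitored wallets (helper names are illustrative).
fn total_onchain_balance_sats(per_wallet: &[AddressTypeBalance]) -> u64 {
    per_wallet.iter().map(|b| b.total_sats).sum()
}

fn spendable_onchain_balance_sats(per_wallet: &[AddressTypeBalance]) -> u64 {
    per_wallet.iter().map(|b| b.spendable_sats).sum()
}
```

This is why checking that each per-type balance survives a restart, plus checking the aggregate, covers both the individual wallet stores and the aggregation path.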
let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + let taproot_addr = node.onchain_payment().new_address_for_type(AddressType::Taproot).unwrap(); + + // Fund each wallet with an amount that individually isn't enough for a larger send + // NativeSegwit: 80,000 sats + // Legacy: 80,000 sats + // Taproot: 80,000 sats + // Total: 240,000 sats + let fund_amount_each = 80_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![taproot_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + + // Generate blocks to confirm + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify we have funds in all three wallets + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + let taproot_balance = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + + assert!( + native_balance.total_sats >= fund_amount_each - 1000, + "NativeSegwit should have ~{} sats", + fund_amount_each + ); + assert!( + legacy_balance.total_sats >= fund_amount_each - 1000, + "Legacy should have ~{} sats", + fund_amount_each + ); + assert!( + taproot_balance.total_sats >= fund_amount_each - 1000, + "Taproot should have ~{} sats", + fund_amount_each + ); + + let total_before = node.list_balances().total_onchain_balance_sats; + + // Create a recipient address + let recipient_addr = 
Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Try to send an amount that requires UTXOs from ALL THREE wallets + // 200,000 sats > 160,000 sats (any two wallets) but < 240,000 sats (all three) + let send_amount = 200_000; + let txid = + node.onchain_payment().send_to_address(&recipient_addr, send_amount, None, None).expect( + "Cross-wallet spending should succeed - UTXOs from all three wallets should be used", + ); + + // Wait for transaction to propagate + wait_for_tx(&electrsd.client, txid).await; + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify total balance decreased + let total_after = node.list_balances().total_onchain_balance_sats; + assert!( + total_after < total_before - send_amount + 10_000, + "Balance should have decreased by at least the send amount" + ); + + node.stop().unwrap(); +} + +// Test that send_all correctly drains all wallets +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_send_all_drains_all_wallets() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with Taproot as primary, monitoring NativeSegwit + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::Taproot; + config.node_config.address_types_to_monitor = vec![AddressType::NativeSegwit]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from both wallet types + let taproot_addr = node.onchain_payment().new_address().unwrap(); + let native_addr = + node.onchain_payment().new_address_for_type(AddressType::NativeSegwit).unwrap(); + + // Fund both wallets + let fund_amount = 100_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![taproot_addr.clone()], + 
Amount::from_sat(fund_amount), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount), + ) + .await; + + // Generate blocks and sync + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify both wallets have funds + let taproot_balance = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + + assert!(taproot_balance.total_sats >= fund_amount - 1000, "Taproot should have funds"); + assert!(native_balance.total_sats >= fund_amount - 1000, "NativeSegwit should have funds"); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send all - should drain both wallets + let txid = node + .onchain_payment() + .send_all_to_address(&recipient_addr, true, None) + .expect("send_all should succeed"); + + // Wait for transaction + wait_for_tx(&electrsd.client, txid).await; + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Verify BOTH wallets are drained + let taproot_balance_after = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + let native_balance_after = + node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let total_after = node.list_balances().total_onchain_balance_sats; + + assert!( + taproot_balance_after.spendable_sats < 10_000, + "Taproot wallet should be drained, but has {} sats", + taproot_balance_after.spendable_sats + ); + assert!( + native_balance_after.spendable_sats < 10_000, + "NativeSegwit wallet should be drained, but has {} sats", + 
native_balance_after.spendable_sats + ); + assert!(total_after < 10_000, "Total balance should be near zero, but is {} sats", total_after); + + node.stop().unwrap(); +} + +// Test that new_address_for_type returns error for unmonitored types +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_new_address_for_unmonitored_type() { + let (_bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Setup node with only NativeSegwit as primary, no additional monitoring + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![]; + + let node = setup_node(&chain_source, config, None); + + // Requesting address for primary type should succeed + let native_result = node.onchain_payment().new_address_for_type(AddressType::NativeSegwit); + assert!(native_result.is_ok(), "Should be able to get address for primary type"); + + // Requesting address for unmonitored type should fail + let legacy_result = node.onchain_payment().new_address_for_type(AddressType::Legacy); + assert!(legacy_result.is_err(), "Should return error for unmonitored address type"); + + let taproot_result = node.onchain_payment().new_address_for_type(AddressType::Taproot); + assert!(taproot_result.is_err(), "Should return error for unmonitored address type"); + + node.stop().unwrap(); +} + +// Test that RBF works correctly when node has multiple wallets configured +// but the transaction only uses UTXOs from a single wallet +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_rbf_single_wallet_input_with_multi_wallet_config() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy + let mut config = common::random_config(true); + config.node_config.address_type = 
AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from both wallet types + let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + + // Fund both wallets - primary wallet gets more funds + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(300_000), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(100_000), + ) + .await; + + // Generate blocks and sync + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify both wallets have funds + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + assert!(native_balance.total_sats >= 299_000, "NativeSegwit should have funds"); + assert!(legacy_balance.total_sats >= 99_000, "Legacy should have funds"); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send a smaller amount that can be satisfied by primary wallet alone + // This leaves room for RBF fee bumping + let send_amount = 50_000; + let initial_txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Initial send should succeed"); + + // Wait for tx to be in mempool + wait_for_tx(&electrsd.client, initial_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Bump fee via RBF - this works because 
the original tx only used one wallet's UTXOs + let higher_fee_rate = FeeRate::from_sat_per_kwu(1000); + let rbf_txid = node + .onchain_payment() + .bump_fee_by_rbf(&initial_txid, higher_fee_rate) + .expect("RBF should succeed for single-wallet-input transaction"); + + assert_ne!(initial_txid, rbf_txid, "RBF should create a new transaction"); + + // Wait for replacement + wait_for_tx(&electrsd.client, rbf_txid).await; + + node.stop().unwrap(); +} + +// Test that RBF works correctly for cross-wallet transactions +// Cross-wallet RBF reduces the change output to pay for higher fees +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_rbf_cross_wallet_transaction() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary, monitoring Legacy + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![AddressType::Legacy]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from both wallet types + let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + + // Fund both wallets - need enough for send + change for RBF + let fund_amount_each = 100_000; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(fund_amount_each), + ) + .await; + + // Generate blocks and sync + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify both wallets have funds before the cross-wallet send + let 
native_balance_before = + node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance_before = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + assert!( + native_balance_before.total_sats >= fund_amount_each - 1000, + "NativeSegwit should have ~100k sats, got {}", + native_balance_before.total_sats + ); + assert!( + legacy_balance_before.total_sats >= fund_amount_each - 1000, + "Legacy should have ~100k sats, got {}", + legacy_balance_before.total_sats + ); + + // Create a recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send amount requiring UTXOs from BOTH wallets + // 120,000 > 100,000 (single wallet) but < 200,000 (both wallets) + // This leaves ~80,000 for change (enough for RBF fee bump) + let send_amount = 120_000; + let initial_txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Initial cross-wallet send should succeed"); + + // Wait for tx to be in mempool + wait_for_tx(&electrsd.client, initial_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Record balance before RBF + let total_before_rbf = node.list_balances().total_onchain_balance_sats; + + // Attempt RBF - cross-wallet RBF should work by reducing the change output + let higher_fee_rate = FeeRate::from_sat_per_kwu(2000); + let rbf_result = node.onchain_payment().bump_fee_by_rbf(&initial_txid, higher_fee_rate); + + // RBF should succeed for cross-wallet transactions + assert!( + rbf_result.is_ok(), + "RBF for cross-wallet transactions should succeed: {:?}", + rbf_result.err() + ); + + let rbf_txid = rbf_result.unwrap(); + assert_ne!(initial_txid, rbf_txid, "RBF should create a new transaction"); + + // Wait for replacement tx and sync + wait_for_tx(&electrsd.client, rbf_txid).await; + 
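Cross-wallet RBF is described above as reducing the change output to cover the higher fee. With rates in sat/kwu (matching `FeeRate::from_sat_per_kwu` as used in these tests), the arithmetic can be sketched as follows; the transaction weight and helper names are illustrative assumptions:

```rust
// Fee for `weight` weight units at `sat_per_kwu` (sats per 1000 weight
// units), rounded up.
fn fee_sats(weight: u64, sat_per_kwu: u64) -> u64 {
    (weight * sat_per_kwu + 999) / 1_000
}

/// Change value after bumping from `old_rate` to `new_rate` while keeping
/// the same inputs and recipient output: the fee increase is paid entirely
/// from the change output. `None` means the change cannot absorb the bump.
fn bumped_change(change_sats: u64, weight: u64, old_rate: u64, new_rate: u64) -> Option<u64> {
    let delta = fee_sats(weight, new_rate).checked_sub(fee_sats(weight, old_rate))?;
    change_sats.checked_sub(delta)
}
```

This is why the test funds the wallets with ~80,000 sats of headroom beyond the send amount: the change output must be large enough to absorb the fee delta for the replacement to be valid.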
node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Verify balance decreased (higher fee was paid from change) + let total_after_rbf = node.list_balances().total_onchain_balance_sats; + assert!( + total_after_rbf < total_before_rbf, + "Balance should decrease after RBF due to higher fee (before: {}, after: {})", + total_before_rbf, + total_after_rbf + ); + + // Confirm the replacement transaction + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Final balance check - should have: original funds - send_amount - fees + let final_balance = node.list_balances().total_onchain_balance_sats; + let expected_max = (fund_amount_each * 2) - send_amount; + assert!( + final_balance < expected_max, + "Final balance {} should be less than {} (original - send)", + final_balance, + expected_max + ); + + node.stop().unwrap(); +} + +// Test that sync updates balances for all wallet types +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_sync_updates_all_wallet_balances() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = + vec![AddressType::Legacy, AddressType::NestedSegwit, AddressType::Taproot]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses from all wallet types + let native_addr = node.onchain_payment().new_address().unwrap(); + let legacy_addr = node.onchain_payment().new_address_for_type(AddressType::Legacy).unwrap(); + let nested_addr = + node.onchain_payment().new_address_for_type(AddressType::NestedSegwit).unwrap(); + let taproot_addr = node.onchain_payment().new_address_for_type(AddressType::Taproot).unwrap(); + + // Verify all 
balances are 0 before funding + assert_eq!(node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap().total_sats, 0); + assert_eq!(node.get_balance_for_address_type(AddressType::Legacy).unwrap().total_sats, 0); + assert_eq!(node.get_balance_for_address_type(AddressType::NestedSegwit).unwrap().total_sats, 0); + assert_eq!(node.get_balance_for_address_type(AddressType::Taproot).unwrap().total_sats, 0); + + // Fund all wallets with different amounts + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![native_addr.clone()], + Amount::from_sat(100_000), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![legacy_addr.clone()], + Amount::from_sat(200_000), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![nested_addr.clone()], + Amount::from_sat(300_000), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![taproot_addr.clone()], + Amount::from_sat(400_000), + ) + .await; + + // Generate blocks + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + + // Sync - this should update all wallet balances + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify all balances updated correctly + let native_balance = node.get_balance_for_address_type(AddressType::NativeSegwit).unwrap(); + let legacy_balance = node.get_balance_for_address_type(AddressType::Legacy).unwrap(); + let nested_balance = node.get_balance_for_address_type(AddressType::NestedSegwit).unwrap(); + let taproot_balance = node.get_balance_for_address_type(AddressType::Taproot).unwrap(); + + assert!( + native_balance.total_sats >= 99_000, + "NativeSegwit should have ~100k, has {}", + native_balance.total_sats + ); + assert!( + legacy_balance.total_sats >= 199_000, + "Legacy should have ~200k, has {}", + legacy_balance.total_sats + ); + assert!( + nested_balance.total_sats >= 299_000, + 
"NestedSegwit should have ~300k, has {}", + nested_balance.total_sats + ); + assert!( + taproot_balance.total_sats >= 399_000, + "Taproot should have ~400k, has {}", + taproot_balance.total_sats + ); + + // Verify aggregate balance is sum of all + let total = node.list_balances().total_onchain_balance_sats; + let expected_total = native_balance.total_sats + + legacy_balance.total_sats + + nested_balance.total_sats + + taproot_balance.total_sats; + assert_eq!(total, expected_total, "Total should equal sum of all wallet balances"); + + node.stop().unwrap(); +} + +// Test that RBF with additional inputs correctly accounts for input weight in fee calculation. +// When adding inputs to meet a higher fee, the inputs themselves add weight, which requires +// more fee. This test verifies the resulting transaction achieves the target fee rate. +#[tokio::test(flavor = "multi_thread", worker_threads = 1)] +async fn test_rbf_additional_inputs_fee_rate_correctness() { + let (bitcoind, electrsd) = setup_bitcoind_and_electrsd(); + let chain_source = TestChainSource::Esplora(&electrsd); + + // Set up node with NativeSegwit as primary + let mut config = common::random_config(true); + config.node_config.address_type = AddressType::NativeSegwit; + config.node_config.address_types_to_monitor = vec![]; + + let node = setup_node(&chain_source, config, None); + + // Get addresses and fund with multiple small UTXOs + // This ensures RBF will need to add inputs when bumping fee significantly + let addr1 = node.onchain_payment().new_address().unwrap(); + let addr2 = node.onchain_payment().new_address().unwrap(); + let addr3 = node.onchain_payment().new_address().unwrap(); + + // Fund with 3 separate UTXOs of 50k sats each + let utxo_amount = 50_000u64; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr1.clone()], + Amount::from_sat(utxo_amount), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr2.clone()], + 
Amount::from_sat(utxo_amount), + ) + .await; + premine_and_distribute_funds( + &bitcoind.client, + &electrsd.client, + vec![addr3.clone()], + Amount::from_sat(utxo_amount), + ) + .await; + + generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 6).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(2)); + + // Verify we have 3 UTXOs totaling ~150k sats + let initial_balance = node.list_balances().total_onchain_balance_sats; + assert!( + initial_balance >= utxo_amount * 3 - 3000, + "Should have ~150k sats, got {}", + initial_balance + ); + + // Create recipient address + let recipient_addr = Address::from_str("bcrt1qw508d6qejxtdg4y5r3zarvary0c5xw7kygt080") + .unwrap() + .require_network(bitcoin::Network::Regtest) + .unwrap(); + + // Send 30k sats - this should use 1 UTXO and leave ~20k change + // Using a low initial fee rate + let send_amount = 30_000; + let initial_txid = node + .onchain_payment() + .send_to_address(&recipient_addr, send_amount, None, None) + .expect("Initial send should succeed"); + + wait_for_tx(&electrsd.client, initial_txid).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_secs(1)); + + // Now bump fee significantly - this should require adding more inputs + // because the change output won't have enough to cover the increased fee + // Use a very high fee rate to force adding inputs + let high_fee_rate = FeeRate::from_sat_per_kwu(5000); // ~20 sat/vB + + let rbf_result = node.onchain_payment().bump_fee_by_rbf(&initial_txid, high_fee_rate); + + assert!( + rbf_result.is_ok(), + "RBF should succeed even when adding inputs: {:?}", + rbf_result.err() + ); + + let rbf_txid = rbf_result.unwrap(); + assert_ne!(initial_txid, rbf_txid, "RBF should create a new transaction"); + + // Wait for replacement and confirm + wait_for_tx(&electrsd.client, rbf_txid).await; + + // Verify the transaction confirms successfully (which means it paid sufficient fee) + 
generate_blocks_and_wait(&bitcoind.client, &electrsd.client, 1).await; + node.sync_wallets().unwrap(); + std::thread::sleep(std::time::Duration::from_millis(500)); + + // Final balance check - the transaction should have confirmed + let final_balance = node.list_balances().total_onchain_balance_sats; + let max_expected = initial_balance - send_amount; + + // Balance should be less than initial - send (due to fees) + assert!( + final_balance < max_expected, + "Final balance {} should be less than {} (initial - send amount)", + final_balance, + max_expected + ); + + // The fee paid should be reasonable (not excessively high due to calculation errors) + let fee_paid = initial_balance - final_balance - send_amount; + // With a ~200 vB tx at 20 sat/vB, expect ~4000 sats fee, allow up to 10000 for margin + assert!(fee_paid < 15000, "Fee {} seems too high, possible fee calculation error", fee_paid); + assert!(fee_paid > 1000, "Fee {} seems too low for the requested fee rate", fee_paid); + + node.stop().unwrap(); +}