Spark names #1532
base: master
Walkthrough

This pull request integrates a new "Spark Names" feature throughout the codebase. It adds a new UI dialog for creating spark names, extends Makefiles and header files to include additional source files, and modifies blockchain and consensus parameters to handle spark name data. Enhancements include cryptographic methods for ownership proof, new RPC functions for spark name management, and wallet model methods to generate and validate spark names. Various UI components are updated to support spark name creation, validation, and transactions.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant CreateSparkNamePage
    participant WalletModel
    participant CSparkWallet
    participant Blockchain
    User->>CreateSparkNamePage: Open Spark Name dialog (via createSparkName button)
    CreateSparkNamePage->>WalletModel: Validate spark name data & update fee
    WalletModel->>CSparkWallet: Initiate spark name transaction
    CSparkWallet->>Blockchain: Create and broadcast Spark Name transaction
    Blockchain-->>CSparkWallet: Confirm transaction creation
    CSparkWallet-->>WalletModel: Return transaction details
    WalletModel-->>CreateSparkNamePage: Notify success
    CreateSparkNamePage-->>User: Display transaction hash and confirmation
```
```mermaid
sequenceDiagram
    participant User
    participant RPC
    participant Wallet
    participant CSparkWallet
    User->>RPC: Call registersparkname(sparkName, sparkAddress, years, info)
    RPC->>Wallet: Validate spark name and address data
    Wallet->>CSparkWallet: Create Spark Name transaction
    CSparkWallet-->>Wallet: Return transaction details
    Wallet-->>RPC: Provide transaction hash as confirmation
    RPC-->>User: Return success response
```
Actionable comments posted: 9
🔭 Outside diff range comments (4)
src/libspark/ownership_proof.h (1)
`1-32`: 🛠️ Refactor suggestion

New ownership proof class follows existing coding standards but needs additional documentation.
The `OwnershipProof` class provides serialization and memory calculation functionality, but lacks documentation about its purpose and the meaning of its member variables. Additionally, the class doesn't have constructors or methods for creating or validating proofs. Consider adding:
- Class-level documentation explaining the purpose of this class
- Documentation for the member variables A, t1, t2, t3
- Constructor(s) for proper initialization
- Methods for creating and validating ownership proofs
- Const-correctness for the member variables if they shouldn't be modified after initialization
```diff
 #ifndef FIRO_LIBSPARK_OWNERSHIP_PROOF_H
 #define FIRO_LIBSPARK_OWNERSHIP_PROOF_H

 #include "params.h"

 namespace spark {

+/**
+ * Represents cryptographic proof of ownership for a spark name.
+ * This contains the elements required to prove ownership of a spark address
+ * without revealing the private keys.
+ */
 class OwnershipProof{
 public:
+    /** Default constructor for serialization */
+    OwnershipProof() = default;
+
+    /** Constructor with required proof components */
+    OwnershipProof(const GroupElement& a, const Scalar& t_1, const Scalar& t_2, const Scalar& t_3)
+        : A(a), t1(t_1), t2(t_2), t3(t_3) {}
+
+    /** Calculate the memory required for serialization */
     inline std::size_t memoryRequired() const {
         return Scalar::memoryRequired() * 3 + GroupElement::memoryRequired();
     }

     ADD_SERIALIZE_METHODS;
     template <typename Stream, typename Operation>
     inline void SerializationOp(Stream& s, Operation ser_action) {
         READWRITE(A);
         READWRITE(t1);
         READWRITE(t2);
         READWRITE(t3);
     }

 public:
+    /** Group element component of the proof */
     GroupElement A;
+    /** First scalar component of the proof */
     Scalar t1;
+    /** Second scalar component of the proof */
     Scalar t2;
+    /** Third scalar component of the proof */
     Scalar t3;
 };

 }

 #endif
```
src/qt/addressbookpage.cpp (3)
`211-217`: ⚠️ Potential issue

Potential mismatch between item text and enumerator check.

Here, the code compares `ui->addressType->currentText()` with `AddressTableModel::SparkName`. If `SparkName` is an integer enumerator or a different string than "Spark names", this check will fail silently. Consider comparing `itemData()` with the enumerator or comparing `currentText()` with the actual string "Spark names."

```diff
- if (ui->addressType->currentText() == AddressTableModel::SparkName)
+ if (ui->addressType->currentData() == SparkName)
      return;
```
`294-307`: 🛠️ Refactor suggestion

Conditional logic for Spark names states is prone to mismatched comparisons.

The variable `fSparkNames` is derived from `ui->addressType->currentText() == AddressTableModel::SparkName`, which may not match if "SparkName" is an enumerator. Use `currentData()` or an explicit string check such as `"Spark names"`.
`385-389`: ⚠️ Potential issue

Inconsistent text checks for "Spark" vs "Spark names."

Comparing the string literal `"Spark names"` here conflicts with the earlier usage of `AddressTableModel::SparkName`. This discrepancy can lead to unexpected behavior if item text differs from the enumerator name. Consider adopting a consistent approach, e.g. always check `currentData()` or always compare with the exact user-facing string if that is intended.
🧹 Nitpick comments (27)
src/txmempool.h (1)
`527-528`: Enhance comment to clarify map structure

While the functionality seems correct, the comment could be more descriptive of the data structure's contents. Consider enhancing the comment to clarify the structure:

```diff
- std::map<std::string, std::pair<std::string, uint256>> sparkNames; // used to rule out duplicate names
+ std::map<std::string, std::pair<std::string, uint256>> sparkNames; // name -> (address, txid) mapping to rule out duplicate names
```
src/txmempool.cpp (1)
`627-635`: Add cleanup for Spark names in transaction removal

This code adds a cleanup mechanism for spark name transactions when their associated transactions are removed from the mempool. The implementation iterates through the sparkNames map and removes entries that reference the transaction being deleted. Consider adding a brief comment explaining the purpose of this loop for future maintainers:

```cpp
// remove all the spark name transactions referencing this tx
for (auto it = sparkNames.begin(); it != sparkNames.end();) {
    if (it->second.second == tx.GetHash()) {
        it = sparkNames.erase(it);
    } else {
        ++it;
    }
}
```
src/wallet/wallet.h (1)
`1105-1110`: Add wallet method for creating Spark name transactions

This new method provides functionality for creating transactions that register or update Spark names. It handles both the Spark name fee (paid to the development fund) and the transaction fee. Consider adding a documentation comment above the method to explain its purpose, parameters, and expected behavior:

```cpp
/**
 * Creates a transaction that registers or updates a Spark name
 * @param sparkNameData Data for the Spark name (name, address, etc.)
 * @param sparkNameFee Fee for registering the Spark name (sent to dev fund)
 * @param txFee Transaction fee (output parameter)
 * @param coinControl Optional coin control parameters
 * @return The created wallet transaction
 */
CWalletTx CreateSparkNameTransaction(
    CSparkNameTxData &sparkNameData,
    CAmount sparkNameFee,
    CAmount &txFee,
    const CCoinControl *coinControl = NULL);
```
src/qt/bitcoinaddressvalidator.cpp (1)
`110-113`: Spark Name validation could be more robust.

While the basic validation for Spark Names checks for the '@' prefix and maximum length, consider implementing additional validation rules such as minimum length, prohibited character sequences, or reserved names to prevent potential abuse.

```diff
  // check for spark name
- if (address[0] == '@' && address.size() <= CSparkNameManager::maximumSparkNameLength + 1)
-     return true;
+ if (address[0] == '@' && address.size() <= CSparkNameManager::maximumSparkNameLength + 1) {
+     // Require at least one character after '@' to prevent empty names
+     if (address.size() < 2)
+         return false;
+     // Consider additional validation here:
+     // - Check for prohibited character sequences
+     // - Verify no consecutive special characters
+     // - Check for reserved names
+     return true;
+ }
```
src/qt/addresstablemodel.h (1)
`94`: Consider documenting the purpose of ProcessPendingSparkNameChanges.

The function name is descriptive, but adding a comment to explain its specific role in handling Spark Name changes would improve code maintainability.

```diff
+/**
+ * Process any pending changes to Spark Names, updating the address table
+ * with new or modified Spark Name entries.
+ */
 void ProcessPendingSparkNameChanges();
```
src/spark/sparkwallet.h (1)
`144-149`: Document the Spark Name fee structure.

Consider adding documentation to explain the fee structure for Spark Names, particularly how the fee varies based on name length as mentioned in the PR objectives.

```cpp
/**
 * Creates a transaction to register a Spark Name.
 *
 * @param nameData The data for the Spark Name
 * @param sparkNamefee The fee for registering the Spark Name (varies by name length, shorter names cost more)
 * @param txFee The transaction fee, will be set by the function
 * @param coinControl Optional coin control for input selection
 * @return The created wallet transaction
 */
CWalletTx CreateSparkNameTransaction(
    CSparkNameTxData &nameData,
    CAmount sparkNamefee,
    CAmount &txFee,
    const CCoinControl *coinControl = NULL);
```
src/rpc/blockchain.cpp (2)
`28`: Ensure proper include ordering and style consistency.

The new include for "../sparkname.h" is not placed according to the standard include ordering convention used in the rest of the file. Typically, local includes come after standard library includes and external dependency includes.

```diff
 #include "evo/cbtx.h"
-#include "../sparkname.h"
+#include "sparkname.h"
```
`180-215`: Implementation of getsparknames RPC is well-structured, but lacks validation or paging for large responses.

The `getsparknames` function appears to return all spark names in a single response, which could potentially cause issues with very large data sets. Consider implementing pagination for this API endpoint. Also, the output format is a flat array mixing names and addresses, which could be confusing. A more structured approach using objects would be clearer.

```diff
 UniValue getsparknames(const JSONRPCRequest &request)
 {
     if (request.fHelp || request.params.size() != 0) {
         throw std::runtime_error(
             "getsparknames\n"
             "\nReturns a list of all Spark names.\n"
             "\nResult:\n"
             "[\n"
-            "  \"Name (string)\n"
-            "  \"Address (string)\"\n"
+            "  {\n"
+            "    \"name\": \"string\",    (string) The spark name\n"
+            "    \"address\": \"string\"  (string) The spark address\n"
+            "  }\n"
             "  ...\n"
             "]\n"
             "\nExamples:\n"
             + HelpExampleCli("getsparknames", "")
             + HelpExampleRpc("getsparknames", "")
         );
     }

     LOCK(cs_main);

     if (!spark::IsSparkAllowed()) {
         throw JSONRPCError(RPC_WALLET_ERROR, "Spark is not activated yet");
     }

     const Consensus::Params &consensusParams = Params().GetConsensus();
     CSparkNameManager *sparkNameManager = CSparkNameManager::GetInstance();
     std::set<std::string> sparkNames = sparkNameManager->GetSparkNames();

     UniValue result(UniValue::VARR);
     for (const auto &name : sparkNames) {
-        result.push_back(name);
         std::string SparkAddr;
-        if (sparkNameManager->GetSparkAddress(name, SparkAddr))
-            result.push_back(SparkAddr);
+        UniValue obj(UniValue::VOBJ);
+        obj.push_back(Pair("name", name));
+        if (sparkNameManager->GetSparkAddress(name, SparkAddr))
+            obj.push_back(Pair("address", SparkAddr));
+        else
+            obj.push_back(Pair("address", ""));
+        result.push_back(obj);
     }
     return result;
 }
```
src/libspark/ownership_proof.h (1)
`23-28`: Consider using access modifiers to restrict direct access to member variables.

Currently, all member variables are public, which might lead to improper usage or modification. Consider making them private and providing accessor methods or using a more explicit design.

```diff
-public:
+private:
     GroupElement A;
     Scalar t1;
     Scalar t2;
     Scalar t3;
+
+public:
+    // Getters
+    const GroupElement& GetA() const { return A; }
+    const Scalar& GetT1() const { return t1; }
+    const Scalar& GetT2() const { return t2; }
+    const Scalar& GetT3() const { return t3; }
```
src/spark/sparkwallet.cpp (1)
`11`: Double-check the necessity of including "sparkname.h".

While it appears that the file relies on Spark name functionality, verifying that including "sparkname.h" is essential helps avoid potential circular dependencies or unnecessary compilation overhead.
src/libspark/keys.cpp (3)
`3`: Transcript usage note.

Including `"transcript.h"` is fine. Just ensure that transcript-based proof logic is indeed used by this file.
`247-259`: Maintain ordering of transcript additions.

You're adding "G", "F", "H", "A", "m", "d", "Q1", "Q2" in that order. If hashing or ordering is crucial to proof correctness, be sure all references rely on exactly the same order for reproducible challenges.
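The order-sensitivity can be illustrated with a toy transcript. This is a sketch only: it uses a simple label-prefixed running hash, not libspark's actual `Transcript` class, and `ToyTranscript`/`challengeInOrder` are names invented here for illustration.

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Toy transcript: absorbs labeled items into a running FNV-1a-style state.
// Labels act as domain separators; the final state is the "challenge".
class ToyTranscript {
public:
    void add(const std::string& label, const std::string& value) {
        absorb(label);
        absorb(value);
    }
    uint64_t challenge() const { return state; }

private:
    void absorb(const std::string& s) {
        state ^= s.size();                     // length-prefix to avoid ambiguity
        state *= 1099511628211ULL;
        for (unsigned char c : s) {
            state ^= c;
            state *= 1099511628211ULL;         // FNV-1a-style mixing
        }
    }
    uint64_t state = 14695981039346656037ULL;  // FNV offset basis
};

// Absorbing the same items in a different order yields a different challenge,
// which is why prover and verifier must agree on the exact ordering.
uint64_t challengeInOrder(const std::vector<std::pair<std::string, std::string>>& items) {
    ToyTranscript t;
    for (const auto& kv : items)
        t.add(kv.first, kv.second);
    return t.challenge();
}
```

The same principle holds for a real transcript: any divergence in the sequence of absorbed elements produces a different challenge, so the prover's proof will not verify.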
`292-310`: Decouple proof failure from exceptions if needed.

Currently, `verify_own()` throws on bad proofs. Consider returning a boolean or specialized status for "bad proof" vs. using an exception. This can improve clarity in batch-verification scenarios.

src/libspark/keys.h (3)
`87`: Add documentation for the challenge method.

The `challenge` method lacks documentation explaining its purpose, parameters, and return value. Since this is a cryptographic method used in ownership proofs, clear documentation is essential for security and maintainability.

```diff
+/**
+ * Computes a cryptographic challenge based on provided parameters.
+ * @param m The scalar message/nonce
+ * @param A First group element for challenge computation
+ * @param H Second group element for challenge computation
+ * @return Scalar result of the challenge computation
+ */
 Scalar challenge(const Scalar& m, const GroupElement& A, const GroupElement& H) const;
```
`88-92`: Add documentation for the ownership proof generation method.

The `prove_own` method lacks documentation explaining its purpose, parameters, and behavior. Proper documentation is essential for secure implementation and usage.

```diff
+/**
+ * Generates a cryptographic proof of address ownership.
+ * @param m The scalar message/nonce used in the proof
+ * @param spend_key The private spend key proving ownership
+ * @param incomingViewKey The incoming view key for verification
+ * @param proof Output parameter that will store the generated proof
+ */
 void prove_own(const Scalar& m, const SpendKey& spend_key, const IncomingViewKey& incomingViewKey, OwnershipProof& proof) const;
```
`96-98`: Consider a more efficient comparison method for addresses.

The current implementation of `operator<` compares the encoded string representations of addresses, which could be inefficient for frequent comparisons. Consider implementing a more efficient comparison that directly compares the internal representation (like comparing the group elements and diversifier) for better performance.

```diff
 bool operator < (const Address &other) const {
-    return encode(0) < other.encode(0);
+    // Compare internal representation directly for efficiency
+    if (d != other.d) return d < other.d;
+    if (!(Q1 == other.Q1)) return Q1 < other.Q1; // Assuming GroupElement has operator< defined
+    return Q2 < other.Q2;
 }
```

Note: This assumes `GroupElement` has proper comparison operators defined. If not, you might need a different approach.

src/validation.cpp (1)
`1637-1638`: Added code to store spark name data in mempool

This code stores valid spark name data in the memory pool when a transaction is accepted, making it available for conflict detection in future transactions. The implementation properly:

- Only adds non-empty names
- Uses case-insensitive storage via `ToUpper` to prevent duplicate registrations with different case
- Associates the name with both the spark address and transaction hash

Consider adding a comment explaining the structure and purpose of `pool.sparkNames` for better maintainability. Something like:

```diff
+    // Store the spark name in mempool to detect future conflicts
+    // Map format: uppercased name -> (spark address, transaction hash)
     if (!sparkNameData.name.empty())
         pool.sparkNames[CSparkNameManager::ToUpper(sparkNameData.name)] = {sparkNameData.sparkAddress, hash};
```
src/libspark/test/ownership_test.cpp (1)
`1-110`: Consider adding deterministic test vectors

While the current random-based tests are good for general validation, consider adding tests with hardcoded values for critical paths. This would make test failures more reproducible and could specifically target edge cases. For cryptographically sensitive code, it's valuable to have tests with predetermined inputs and expected outputs to ensure consistent behavior across all environments and to make debugging easier. Consider adding at least one test case with fixed seed or predefined values.
🧰 Tools
🪛 Cppcheck (2.10-2)
[error] 7-7: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
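A deterministic vector test can be as small as the following sketch. It is illustrative only: `toy_hash` is a stand-in FNV-1a function invented here, since real vectors would be captured once from `prove_own`/`verify_own` with fixed keys and then hardcoded into the test.

```cpp
#include <cstdint>
#include <string>

// Toy stand-in for the operation under test: 64-bit FNV-1a.
uint64_t toy_hash(const std::string& input) {
    uint64_t h = 14695981039346656037ULL;        // FNV offset basis
    for (unsigned char c : input) {
        h ^= c;
        h *= 1099511628211ULL;                   // FNV prime
    }
    return h;
}

// Deterministic test vectors: fixed inputs with precomputed outputs.
// Any platform difference or refactor that changes behavior fails immediately.
bool run_fixed_vector_tests() {
    if (toy_hash("") != 0xcbf29ce484222325ULL)   // published FNV-1a("") vector
        return false;
    if (toy_hash("a") != 0xaf63dc4c8601ec8cULL)  // published FNV-1a("a") vector
        return false;
    return true;
}
```

The same shape transfers directly to a Boost test case: generate the proof once with fixed keys, serialize it, and compare against the stored hex bytes.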
src/qt/walletmodel.cpp (1)
`1602-1656`: Consider extracting common transaction preparation logic

There's significant code duplication across the various transaction preparation methods (`prepareTransaction`, `prepareJoinSplitTransaction`, `prepareMintSparkTransaction`, `prepareSpendSparkTransaction`, and now `prepareSparkNameTransaction`). Consider refactoring shared elements like error handling, fee calculation, and change position detection into helper methods to reduce duplication and ensure consistent handling across all transaction types.
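One possible shape for such a refactor, sketched with simplified stand-in types (`SendStatus`, `PrepareResult`, and `prepareWithCommonChecks` are hypothetical names; the real `WalletModel` status enums and transaction objects differ):

```cpp
#include <functional>

// Simplified stand-ins for WalletModel's richer status/result types.
enum class SendStatus { OK, AmountExceedsBalance, TransactionCreationFailed };

struct PrepareResult {
    SendStatus status;
    long long fee = 0;
};

// Shared wrapper: each prepareXxx method passes only the part that differs
// (the type-specific transaction-building step), while the balance check
// and error mapping live in one place for all transaction types.
PrepareResult prepareWithCommonChecks(
    long long totalAmount,
    long long balance,
    const std::function<bool(long long& feeOut)>& buildTx)
{
    if (totalAmount > balance)
        return {SendStatus::AmountExceedsBalance};
    long long fee = 0;
    if (!buildTx(fee))                        // delegate the type-specific work
        return {SendStatus::TransactionCreationFailed};
    return {SendStatus::OK, fee};
}
```

Each existing method would then shrink to a call like `prepareWithCommonChecks(total, balance, [&](long long& fee) { /* build this tx type */ })`, keeping the duplicated checks in a single audited location.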
src/qt/createsparknamepage.h (2)
`37-38`: Add documentation for CreateSparkNameTransaction method

This method appears to be a key part of the implementation, but lacks documentation about its role, parameters, and return value. Add a comment describing:

- What the method does
- The meaning of each parameter
- What the boolean return value indicates
- Any side effects or state changes

For example:

```cpp
/**
 * Creates and commits a transaction to register a Spark name.
 *
 * @param name The Spark name to register
 * @param address The Spark address to associate with the name
 * @param numberOfYears Duration for which the name should be registered
 * @param additionalInfo Additional metadata to store with the name
 * @return true if transaction was created and broadcast successfully, false otherwise
 */
bool CreateSparkNameTransaction(const std::string &name, const std::string &address,
                                int numberOfYears, const std::string &additionalInfo);
```
`39-40`: Document fee calculation logic

The `updateFee()` method likely handles an important aspect of the feature - calculating registration fees based on name characteristics. This should be documented. Add a comment explaining:

- How fees are calculated
- What factors influence the fee (e.g., name length, duration)
- How the fee is displayed to the user

This is particularly important since the PR description mentions that shorter names have higher fees, which is a non-obvious business rule that should be clearly documented.
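As an illustration of that rule, a length-indexed fee lookup might look like the following sketch. The table values, the per-year multiplier, and the names `kNameFee`/`sparkNameFee` are all invented for illustration; the real values live in `standardSparkNamesFee` in the Firo consensus parameters.

```cpp
#include <array>
#include <cstdint>
#include <stdexcept>
#include <string>

// Hypothetical fee table in satoshis: index = name length, shorter => pricier.
constexpr int64_t COIN = 100000000;
constexpr std::array<int64_t, 21> kNameFee = {
    -1,          // index 0 unused: empty names are invalid
    1000 * COIN, // 1-character names are the most expensive
    100 * COIN,  // 2 characters
    10 * COIN,   // 3 characters
    10 * COIN, 10 * COIN, 10 * COIN,                   // 4-6 characters
    5 * COIN, 5 * COIN, 5 * COIN, 5 * COIN,            // 7-10 characters
    1 * COIN, 1 * COIN, 1 * COIN, 1 * COIN, 1 * COIN,  // 11-15 characters
    1 * COIN, 1 * COIN, 1 * COIN, 1 * COIN, 1 * COIN   // 16-20 characters
};

// Bounds-checked lookup: rejects empty or over-long names rather than
// indexing out of range, then scales by the registration period in years.
int64_t sparkNameFee(const std::string& name, int years) {
    if (name.empty() || name.size() >= kNameFee.size())
        throw std::invalid_argument("invalid spark name length");
    if (years < 1)
        throw std::invalid_argument("invalid registration period");
    return kNameFee[name.size()] * years;
}
```

Documenting `updateFee()` in these terms (table indexed by length, multiplied by years, bounds-checked) would make the business rule auditable at a glance.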
src/qt/addresstablemodel.cpp (1)
`390-399`: Inconsistent label casing for 'spark' vs. 'spark name'.

While functionally sound, consider standardizing to maintain consistent capitalization across the UI (e.g., "Spark" / "Spark Name").

```diff
- return "spark";
+ return "Spark";
  ...
- return "spark name";
+ return "Spark Name";
```
`121-126`: Addition of "Spark names" to the addressType combo box.

Looks good. Just ensure that SparkName is handled uniformly throughout the code, especially if the index ordering changes. Hardcoding or relying on a specific combo box position may lead to brittle code.
`408-421`: Hardcoded index check (`idx == 2`) is fragile.

If new address types are inserted or the ordering changes, this logic will break. Instead, compare `itemData(idx)` or check for the enumerator that corresponds to SparkName.

```diff
- if (idx == 2) {
+ if (ui->addressType->itemData(idx).toInt() == SparkName) {
      // ...
  }
```
src/sparkname.h (2)
`1-2`: Typographical mismatch in header guards.

It reads `#ifndef FIFO_SPARKNAME_H` but the `#endif` line references `FIRO_SPARKNAME_H`. Consider aligning them to a consistent symbol:

```diff
-#ifndef FIFO_SPARKNAME_H
+#ifndef FIRO_SPARKNAME_H
 #define FIRO_SPARKNAME_H
 ...
 #endif // FIRO_SPARKNAME_H
```
`95-96`: Concurrent access to sparkNames and sparkNameAddresses.

These containers are modified in multiple places (AddBlock/RemoveBlock, etc.). It's unclear if higher-level locks ensure thread safety. If not, consider adding proper locking to avoid data races.
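If no higher-level lock covers these maps, one option is a single mutex guarding both containers so that name/address updates stay atomic and consistent. The sketch below uses a simplified stand-in type (`NameRegistry` with string values) rather than the real `CSparkNameManager` members:

```cpp
#include <map>
#include <mutex>
#include <string>

// Hypothetical simplified manager; the real CSparkNameManager holds
// sparkNames and sparkNameAddresses and mutates them from AddBlock/RemoveBlock.
class NameRegistry {
public:
    bool Add(const std::string& name, const std::string& address) {
        std::lock_guard<std::mutex> lock(cs);    // one lock guards both maps
        if (names.count(name) || addresses.count(address))
            return false;                        // duplicates rejected atomically
        names[name] = address;
        addresses[address] = name;
        return true;
    }

    bool Remove(const std::string& name) {
        std::lock_guard<std::mutex> lock(cs);
        auto it = names.find(name);
        if (it == names.end())
            return false;
        addresses.erase(it->second);             // keep both maps in sync
        names.erase(it);
        return true;
    }

private:
    mutable std::mutex cs;
    std::map<std::string, std::string> names;      // name -> address
    std::map<std::string, std::string> addresses;  // address -> name
};
```

Whether a new mutex is needed or `cs_main` already serializes these paths is exactly what the review comment asks the authors to confirm.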
`149-164`: Mixed reference to consensusParams fields and potential naming confusion.

Line 152 references `nSparkNamesStartBlock`, but the error message uses `consensusParams.nSparkStartBlock`. Confirm whether these parameters differ. If it's intended to read from `nSparkNamesStartBlock`, the error message should reflect that.

```diff
-return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkStartBlock));
+return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkNamesStartBlock));
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (45)
- contrib/bitcoin-qt.pro (1 hunks)
- src/Makefile.am (3 hunks)
- src/Makefile.qt.include (4 hunks)
- src/Makefile.test.include (1 hunks)
- src/chain.h (4 hunks)
- src/chainparams.cpp (5 hunks)
- src/consensus/params.h (1 hunks)
- src/libspark/keys.cpp (2 hunks)
- src/libspark/keys.h (2 hunks)
- src/libspark/ownership_proof.h (1 hunks)
- src/libspark/test/ownership_test.cpp (1 hunks)
- src/libspark/util.h (1 hunks)
- src/primitives/transaction.h (1 hunks)
- src/qt/addressbookpage.cpp (9 hunks)
- src/qt/addressbookpage.h (2 hunks)
- src/qt/addresstablemodel.cpp (9 hunks)
- src/qt/addresstablemodel.h (2 hunks)
- src/qt/bitcoinaddressvalidator.cpp (2 hunks)
- src/qt/createsparknamepage.cpp (1 hunks)
- src/qt/createsparknamepage.h (1 hunks)
- src/qt/forms/createsparkname.ui (1 hunks)
- src/qt/forms/receivecoinsdialog.ui (1 hunks)
- src/qt/receivecoinsdialog.cpp (6 hunks)
- src/qt/receivecoinsdialog.h (1 hunks)
- src/qt/sendcoinsdialog.cpp (1 hunks)
- src/qt/sendcoinsentry.cpp (1 hunks)
- src/qt/walletmodel.cpp (2 hunks)
- src/qt/walletmodel.h (2 hunks)
- src/rpc/blockchain.cpp (3 hunks)
- src/rpc/client.cpp (1 hunks)
- src/secp256k1/src/cpp/Scalar.cpp (6 hunks)
- src/spark/sparkwallet.cpp (3 hunks)
- src/spark/sparkwallet.h (3 hunks)
- src/spark/state.cpp (7 hunks)
- src/spark/state.h (2 hunks)
- src/sparkname.cpp (1 hunks)
- src/sparkname.h (1 hunks)
- src/txdb.cpp (1 hunks)
- src/txmempool.cpp (1 hunks)
- src/txmempool.h (1 hunks)
- src/ui_interface.h (2 hunks)
- src/validation.cpp (4 hunks)
- src/wallet/rpcwallet.cpp (3 hunks)
- src/wallet/wallet.cpp (2 hunks)
- src/wallet/wallet.h (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- src/qt/forms/createsparkname.ui
🧰 Additional context used
🪛 Cppcheck (2.10-2)
src/libspark/test/ownership_test.cpp
[error] 7-7: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
src/spark/state.cpp
[error] 811-811: Syntax Error
(internalAstError)
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: Build for Windows
- GitHub Check: Build for Linux
- GitHub Check: Build for macOS
🔇 Additional comments (107)
contrib/bitcoin-qt.pro (1)
`22`: Integration of spark name creation UI form

The addition of the `createsparkname.ui` form file is properly formatted and follows the project's UI file organization pattern. This UI form will support the new spark name creation feature.

src/libspark/util.h (1)
`33`: Added ownership transcript label for cryptographic operations

This new constant follows the naming convention pattern of other transcript labels and will be used for ownership proofs in the spark name system.
src/Makefile.test.include (1)
`95`: Added ownership test file for proper test coverage

The new test file `ownership_test.cpp` is properly integrated into the build system, ensuring that the new spark name ownership functionality is adequately tested.

src/spark/state.h (2)
`13`: Added sparkname.h include for supporting spark name functionality

The inclusion of this header file enables the use of spark name-related data structures within the state system.
`32-34`: ✅ Verification successful

Added spark names storage to transaction info class

The new `sparkNames` map is properly added to the `CSparkTxInfo` class to store and manage spark name data. It follows the class's naming conventions and is appropriately documented.

🏁 Script executed:

```shell
#!/bin/bash
# Description: Check for usage of the sparkNames map in related files to verify proper implementation

# Check how the spark names map is used in ConnectBlockSpark (mentioned in the AI summary)
rg -A 5 "sparkNames" src/spark/state.cpp

# Check how the CSparkNameTxData structure is defined
rg -A 10 "CSparkNameTxData" src/sparkname.h
```

Length of output: 3639

Spark Names Storage Addition Verified

The verification confirms that:

- The `sparkNames` map in `src/spark/state.h` is correctly integrated into the `CSparkTxInfo` class.
- Its usage in `src/spark/state.cpp` (e.g., checking for emptiness, iteration, and conflict resolution via `CSparkNameManager::IsInConflict`) aligns with expected behavior.
- The `CSparkNameTxData` structure in `src/sparkname.h` is properly defined and supports the necessary spark name operations.

Approved as is.
src/qt/forms/receivecoinsdialog.ui (1)
`208-226`: UI button added to support new Spark name creation

The button looks well-implemented and properly configured with appropriate properties and tooltip. It's correctly placed within the horizontal layout of action buttons.
src/rpc/client.cpp (1)
`204-205`: RPC parameter conversion entry added correctly

The new entry for "registersparkname" properly identifies that the third parameter (index 2) requires conversion, which aligns with the Spark names feature being implemented.
src/qt/receivecoinsdialog.h (1)
`87`: New slot added for Spark name creation

The slot method is correctly placed among other UI-related private slots and follows the project's naming conventions.
src/txdb.cpp (1)
`432-434`: LGTM: Addition of spark name tracking to block index loading

The code correctly loads the spark name data from the disk block index and assigns it to the block index object in memory. This change ensures that the spark name additions and removals are properly tracked when loading the blockchain from disk.
src/Makefile.am (3)
`247`: LGTM: Added sparkname.h to core headers

The header file is correctly added to the Bitcoin Core headers list, which ensures it will be properly included in the build process.

`428`: LGTM: Added sparkname.cpp to server sources

The sparkname.cpp source file is correctly added to the server sources, which ensures that the functionality will be compiled and linked into the server component.

`689`: LGTM: Added ownership_proof.h to libspark sources

This header is correctly added to the libspark sources. Based on the naming, this likely provides cryptographic proof of ownership functionality for spark names.
src/primitives/transaction.h (1)
`39-40`: LGTM: New transaction type for spark name aliases

Added a new transaction type enum value for TRANSACTION_ALIAS, which will be used to identify transactions related to spark name registration. The value is properly incremented from the previous enum value.
src/consensus/params.h (1)
`267-269`: ✅ Verification successful

Verify the fee structure implementation for spark names

The code correctly adds consensus parameters for the spark names feature:

- `nSparkNamesStartBlock` determines when the feature is activated
- `nSparkNamesFee` is an array of 21 different fee levels, presumably for different name lengths

Based on the PR description, shorter names should incur higher fees. Ensure that the implementation correctly applies the fee structure based on name length and that the array is indexed appropriately.

🏁 Script executed:

```shell
#!/bin/bash
# Description: Check how the spark name fees are used in the codebase

echo "Checking how spark name fees are applied in the code:"
rg -A 10 "nSparkNamesFee" --type cpp

echo ""
echo "Looking for spark name fee calculation logic:"
rg -A 10 "calculateSparkNameFee|getSparkNameFee" --type cpp
```

Length of output: 6097

Action: Confirm correct fee structure implementation for spark names.

The implementation appears consistent and correct based on our review:

- Activation Parameter: `nSparkNamesStartBlock` is used to determine when the spark names feature is activated.
- Fee Calculation: Across the codebase (e.g., in `src/sparkname.cpp`, `src/qt/createsparknamepage.cpp`, and `src/wallet/rpcwallet.cpp`), fees are calculated inline by indexing the `nSparkNamesFee` array with the spark name's length (using either `.size()` or `.length()`). This approach aligns with the intended design where shorter names incur higher fees.
- Consensus Parameter Setup: The consensus parameters, including the 21-level fee structure (from `standardSparkNamesFee`), are correctly assigned in `src/chainparams.cpp`.

Note: Ensure that the values in `standardSparkNamesFee` enforce the desired fee progression (i.e., higher fees for shorter names) and that name lengths are appropriately bounded elsewhere to avoid out-of-range indexing.

src/qt/sendcoinsentry.cpp (1)
`171-174`: ✅ Verification successful

Add support for Spark name validation in address field

This change allows the UI to accept addresses starting with "@" symbol (Spark names) as valid inputs for payment destinations. The validation is basic, only checking for the "@" prefix and a maximum length constraint. The code simply marks addresses beginning with "@" as valid based only on length constraints, without deeper validation of the Spark name format or existence. Ensure that further validation occurs elsewhere in the code path before transaction creation:

🏁 Script executed:

```shell
#!/bin/bash
# Check where Spark name validation is performed
rg -A 3 -B 3 "validateSparkName|CSparkNameManager::validate"
```

Length of output: 1605

Spark Name UI Validation: Preliminary Check with Deeper Validation Downstream

The UI now accepts inputs starting with "@" as valid based solely on a length check, allowing Spark names to pass the initial validation in the SendCoinsEntry UI. As verified, additional, more robust validation is implemented later in the code (in methods such as `WalletModel::validateSparkNameData` in src/qt/walletmodel.cpp and in src/qt/createsparknamepage.cpp). Ensure that the comprehensive validation downstream correctly handles the Spark name format and other conditions before transaction submission.
19-19
: Forward declaration of new Spark name data structureThis forward declaration introduces a new class
CSparkNameBlockIndexData
that will be used to track Spark name data throughout the codebase.The forward declaration is appropriate and follows good C++ practices.
125-128
: Add notification signals for Spark name changesTwo new signals have been added to notify the UI when Spark names are added or removed. These signals will allow the wallet UI to stay synchronized with Spark name state changes in the blockchain.
The signal implementation follows the established pattern in the codebase for notification mechanisms, making it consistent with the existing architecture.
src/chain.h (4)
25-25
: Include of the new header file looks good.
This file introduces the required definitions for spark name functionality.
272-275
: Well-documented data structures for spark name tracking.
These maps provide a clean way to track both added/extended spark names and removed spark names within each block. The use of
CSparkNameBlockIndexData
allows storing additional metadata alongside each name.
319-320
: Properly clearing spark name maps on SetNull().
These lines ensure the new maps are cleared when a block index is reset to its null state, maintaining consistency with the rest of the class.
596-599
: Properly gated serialization of spark name data.
This serialization logic is correctly gated behind both a serialization type check and a height check to ensure backward compatibility. Only blocks at or above the
nSparkNamesStartBlock
will include the spark name data in serialization.
src/qt/sendcoinsdialog.cpp (1)
311-322
: Well-implemented support for spark name resolution.
This code extends the send coins functionality to support spark names by:
- Checking for addresses starting with "@" symbol
- Extracting the spark name
- Attempting to resolve it to an actual address
- Providing appropriate error feedback if the resolution fails
The implementation is clean and handles user feedback appropriately.
src/qt/addressbookpage.h (1)
48-50
: Addition of SparkName to AddressTypeEnum is appropriate.
The SparkName enum value is properly placed after the existing values, ensuring compatibility with any code that might use numeric values of this enum.
src/Makefile.qt.include (4)
122-123
: New UI form for creating spark names added correctly.
The
createsparkname.ui
file is appropriately added to the QT_FORMS_UI section, ensuring it will be properly processed by the build system.
185-186
: MOC file for the new class is correctly added.
The
moc_createsparknamepage.cpp
file is added to the QT_MOC_CPP section, ensuring the Qt Meta-Object Compiler will process the new class.
269-270
: Header file for the new class is correctly added.
The
createsparknamepage.h
file is added to the BITCOIN_QT_H section, ensuring it will be included in the build.
460-461
: Implementation file for the new class is correctly added.
The
createsparknamepage.cpp
file is added to the BITCOIN_QT_WALLET_CPP section, ensuring it will be compiled as part of the wallet functionality.
src/qt/receivecoinsdialog.cpp (6)
17-17
: Proper integration of new Spark Name feature in the UI.
The inclusion of the
createsparknamepage.h
header properly integrates the new Spark Name functionality into the UI components.
56-56
: Good UI visibility control based on address type.
The conditional display logic for the
createSparkNameButton
ensures the button is only visible when the appropriate address type (Spark) is selected, providing a clean user experience.
Also applies to: 59-59
90-90
: Proper event connection for the Spark Name creation button.
The connection between the button click event and the
createSparkName
method is correctly implemented using Qt's signal-slot mechanism.
141-141
: Ensure consistent UI state in clear method.
Good practice to reset the visibility of the Spark Name button when clearing the dialog, maintaining a consistent UI state.
332-332
: Consistent handling of UI element visibility in displayCheckBox method.
The method correctly toggles the visibility of the
createSparkNameButton
based on the address type selected in the combo box, keeping the UI consistent with the application state.
Also applies to: 335-335
468-473
:❓ Verification inconclusive
Implementation of Spark Name creation dialog.
The implementation of the
createSparkName
method appropriately creates a new dialog for Spark Name creation, sets up the necessary model, and displays it to the user. Make sure to verify that the dialog is properly cleaned up after use, which appears to be handled by the Qt::WA_DeleteOnClose
attribute.
🏁 Script executed:
#!/bin/bash
# Let's confirm proper cleanup of the dialog by checking for any memory management issues
grep -r "CreateSparkNamePage" . --include="*.cpp" | grep -v "new CreateSparkNamePage" | grep "delete"

Length of output: 99
Action Required: Please Verify Dialog Cleanup Mechanism
The implementation of
createSparkName
correctly creates the dialog, assigns the model, and uses Qt::WA_DeleteOnClose
to ensure it gets automatically deleted after closing. The absence of explicit delete
calls in our grep search is expected with this Qt attribute. However, because the automated search did not provide additional confirmation, please manually verify that the dialog is indeed being cleaned up after use (for example, by confirming that no memory leaks or lingering instances occur during runtime).
- Ensure that
CreateSparkNamePage
’s destructor and any associated cleanup logic correctly handle all allocated resources.
- Manually test the dialog’s lifecycle in the application to ensure that the use of
Qt::WA_DeleteOnClose
behaves as expected.
src/qt/bitcoinaddressvalidator.cpp (1)
70-73
: Extended validation for Spark Name format.
The address validation is properly extended to support the Spark Name format, allowing '@' as the first character and special characters ('.', '-', '_') within Spark Names. This implementation ensures backwards compatibility while adding support for the new format.
src/qt/addresstablemodel.h (3)
60-60
: Addition of SparkName type constant for address table model.
The new
SparkName
constant is consistently added alongside other address type constants, maintaining the existing code pattern.
96-96
: Good practice adding const qualifier to accessor method.
The addition of the
const
qualifier to getWalletModel()
is a good practice as it indicates the method doesn't modify the object's state. This improves code safety and allows the method to be called from other const methods.
93-96
:✅ Verification successful
Verify integration with existing wallet model functionality.
Ensure that the new
ProcessPendingSparkNameChanges()
method properly integrates with the existing wallet model functionality and that calling applications correctly handle the Spark Name type.
🏁 Script executed:
#!/bin/bash
# Check where the ProcessPendingSparkNameChanges method is implemented and called
grep -r "ProcessPendingSparkNameChanges" --include="*.cpp" .

# Check how the SparkName constant is used
grep -r "SparkName" --include="*.cpp" .

Length of output: 19747
Integration of
ProcessPendingSparkNameChanges()
Verified
The new
ProcessPendingSparkNameChanges()
method is properly implemented in src/qt/addresstablemodel.cpp and is correctly invoked from src/qt/addressbookpage.cpp. The grep results confirm that the Spark Name functionality—including usage of the SparkName constant—is consistently integrated across the wallet model and related UI components. Please ensure that existing tests continue to cover these integration points, but no additional changes appear necessary at this time.
src/spark/sparkwallet.h (4)
14-14
: Appropriate inclusion of Spark Name header.
The addition of the sparkname.h header is necessary to support the new Spark Name functionality in the Spark wallet.
48-48
: Type-safe overload for isAddressMine method.
Adding an overload of
isAddressMine
that accepts a spark::Address
directly improves type safety and potentially performance by avoiding unnecessary string conversions.
144-149
: Transaction creation for Spark Names.
The addition of
CreateSparkNameTransaction
method provides a clean interface for creating transactions specifically for Spark Names. The method parameters include all necessary information: name data, fees, and optional coin control.
144-149
:❓ Verification inconclusive
Verify transaction fee handling.
Ensure that the transaction fee handling in
CreateSparkNameTransaction
follows the same approach as other transaction creation methods in the wallet. The method should validate that the fee is sufficient for the transaction to be accepted by the network, especially considering that Spark Names with shorter lengths incur higher registration costs as mentioned in the PR objectives.
🏁 Script executed:
#!/bin/bash
# Check the implementation of CreateSparkNameTransaction to ensure proper fee handling
grep -A 30 "CreateSparkNameTransaction" --include="sparkwallet.cpp" src/

Length of output: 97
I'm noticing that the previous command returned an error due to how the directory was passed to grep. To further verify the fee handling within
CreateSparkNameTransaction
, please run a revised command that recursively searches for the implementation in the sparkwallet.cpp
file. For example, try:
#!/bin/bash
grep -R -A 30 "CreateSparkNameTransaction(" --include="sparkwallet.cpp" src/
This should retrieve the surrounding implementation code so the fee handling logic can be thoroughly reviewed against other wallet transaction methods. Once you have the output, please verify that the fee validation is consistent with the intended approach.
src/rpc/blockchain.cpp (1)
1745-1746
: Well-structured RPC command registration.
The registration of the new RPC commands is correctly implemented.
src/libspark/ownership_proof.h (2)
10-12
: Memory calculation logic appears correct but lacks validation.
The
memoryRequired
method simply adds the memory requirements of the member variables. While this approach is generally correct, it doesn't account for potential padding between members or other implementation-specific memory allocations.
Verify that the memory calculation is sufficient for your use cases. If precise memory allocation is important for your application, consider using
sizeof(*this)
or consulting with your platform's documentation about memory alignment requirements.
16-21
: Serialization implementation is correct and follows the existing conventions.
The serialization code uses READWRITE for each member, which is consistent with the codebase's patterns.
src/wallet/rpcwallet.cpp (3)
src/wallet/rpcwallet.cpp (3)
3813-3823
: Support for Spark name resolution added to spendspark
This change adds the ability to use spark names (prefixed with '@') as recipients when spending spark funds. The code correctly retrieves the actual spark address from the
CSparkNameManager
when a name is provided.
3911-3989
: New RPC command for registering spark names
This implementation adds a new RPC function to register human-readable aliases for spark addresses, similar to a domain name system for spark addresses. The implementation properly:
- Verifies wallet is unlocked and spark is activated
- Validates the spark name (length, format)
- Limits the number of years (1-10)
- Calculates the appropriate fee based on name length and years
- Creates and commits the transaction
This seems like a valuable feature that will improve user experience by allowing human-readable addresses instead of complex cryptographic ones.
5861-5861
: Registration of new RPC command
Correctly registers the new
registersparkname
command in the RPC command table.
src/wallet/wallet.cpp (1)
5872-5886
: New method for spark name transaction creation looks good
This new method
CreateSparkNameTransaction
properly delegates the transaction creation to the sparkWallet
implementation after performing necessary wallet validation. The implementation follows the established pattern in the codebase with appropriate error handling.
I particularly like the sanity checks before delegation:
- Ensuring the mint wallet is available
- Verifying the wallet is unlocked before proceeding
src/spark/sparkwallet.cpp (2)
268-269
: Looks good.
Returning
isAddressMine(address)
neatly delegates the logic to the newly introduced overload.
271-290
: Validate potential out-of-range diversifier.
After attempting to derive the diversifier (viewKey.get_diversifier(address.get_d())), this method converts it to int32_t. If d is large, the conversion may lead to negative or out-of-range values for getAddress(int32_t(d)). Consider validating that d fits within the wallet’s feasible diversifier range.
Would you like a verification script that scans the codebase to check for range checks around diversifiers?
src/chainparams.cpp (5)
173-179
: Ensure negative fee is safely handled.
The standardSparkNamesFee array starts with -1. If the code interprets negative fees incorrectly, it may cause unexpected behavior. Verify that all references to nSparkNamesFee properly handle the negative sentinel value and do not attempt to charge or collect negative fees.
492-494
: Deferred Spark Names start block.
Defining
nSparkNamesStartBlock = INT_MAX
effectively disables spark names on mainnet until updated. This is a straightforward approach to gating the feature.
799-800
: TestNet Spark Names activation.
Spark names activate at block 174000 on testnet. Confirm that any existing test procedures accommodate this threshold. If integration tests rely on earlier blocks, they may need block rewinds or additional mocks.
1047-1048
: Devnet Spark Names.
Setting
nSparkNamesStartBlock = 3500
is consistent with the typical devnet approach to early feature testing.
1291-1292
: Enable Spark Names on RegTest earlier.
Starting at block 2000 on regtest is fine, allowing quick local testing. No concerns here.
src/secp256k1/src/cpp/Scalar.cpp (6)
217-217
: Switch to std::runtime_error
is appropriate.
Throwing standard C++ exceptions simplifies error handling and aligns with best practices.
259-259
: Checked exception for hash overflow.
Good to see an explicit
std::runtime_error
ensuring overflow is handled properly.
295-295
: Decode overflow exception.
Matches the established pattern for error handling, ensuring consistent usage of
std::runtime_error
.
317-317
: Length check consistency.
Raising an exception for invalid length fosters robust safety checks.
328-328
: Invalid hex error.
Using
std::runtime_error
clarifies that this is a recoverable runtime error, rather than a fatal condition.
337-337
: Duplicate overflow check.
Consistent approach to manage decoding overflow.
src/libspark/keys.cpp (1)
262-290
: Validate if newly randomized scalars need further boundary checks.
The proof code randomizes
a, b, c
and checks for an infinite group element reference. Everything else is straightforward. Ensure that the randomization approach doesn't produce edge cases (e.g., calling randomize might theoretically yield scalar zero, albeit rarely).
Would you like me to verify usage of zero-check logic for these random scalars in the codebase?
src/libspark/keys.h (2)
7-8
: Appropriate header inclusions for new functionality.
The addition of
uint256.h
and ownership_proof.h
headers supports the new cryptographic ownership proof capabilities. These inclusions properly integrate with the ownership proof verification system needed for spark names.
87-98
:❓ Verification inconclusive
Verify correctness of cryptographic implementation.
These new cryptographic methods for ownership proofs are critical security components. Ensure they've been properly reviewed by cryptography experts and tested extensively against known vectors to confirm they provide the intended security guarantees.
🏁 Script executed:
#!/bin/bash
# Check if the implementation has test coverage
# Look for tests specific to these ownership proof methods
echo "Searching for test coverage of ownership proof methods..."
rg -l "prove_own|verify_own|challenge" --type cpp --type h src/test

Length of output: 194
Action Required: Enhance Cryptographic Verification & Test Coverage
- The new cryptographic methods (i.e.,
challenge
, prove_own
, and verify_own
) in src/libspark/keys.h
are critical security components.
- The initial search in the
src/test
directory did not reveal any tests covering these methods. Please ensure that comprehensive tests—including edge cases and known vector validations—are added to the test suite.
- It is imperative that these methods undergo an expert cryptographic review. Confirm via manual verification that the implementations meet the expected security standards.
src/validation.cpp (4)
54-54
: Added include for sparkname.h
The new include for
sparkname.h
is properly placed among other includes, enabling the new spark name functionality to be used in this file.
932-932
: Added sparkname transaction data storage variable
This new variable declaration is properly placed alongside other transaction validation-related declarations and will store the data related to spark name transactions during validation.
1018-1027
: Added code to check spark name transactions and detect conflicts
This block adds essential validation for spark name transactions by:
- Verifying the transaction against chainActive height
- Checking for conflicts with existing names in the mempool
- Rejecting transactions with conflicting names
The code follows best practices by returning early on validation failure and using the state object for error reporting.
1022-1026
:✅ Verification successful
Verify the lambda function for conflict detection
The lambda function used for conflict detection extracts the first element from the map value (the spark address). This approach seems correct if you need to compare specific attributes of conflicting transactions.
🏁 Script executed:
#!/bin/bash
# Search for other uses of IsInConflict to confirm lambda usage pattern
echo "Searching for CSparkNameManager::IsInConflict usages..."
rg "CSparkNameManager::IsInConflict" --type cpp -A 2 -B 2

echo "Checking CSparkNameManager class implementation..."
rg "class CSparkNameManager" --type cpp -A 10 -B 0
rg "IsInConflict" --type cpp -A 5 -B 5

Length of output: 4644
Verification Complete: Lambda Conflict Detection Usage Validated
After reviewing the codebase:
- The lambda in
src/validation.cpp
returns it->second.first
to extract the spark address, matching the intended design.
- A similar usage in
src/spark/state.cpp
returns it->second.sparkAddress
, which reflects a context-specific naming but follows the same extraction principle.
- Both usages align with the overload of
CSparkNameManager::IsInConflict
in the header, and there’s no evidence of incorrect behavior.
The lambda functions are correctly implemented for conflict detection based on their context. No changes are required.
src/libspark/test/ownership_test.cpp (4)
1-6
: Necessary includes are present
The header files used include
keys.h
for Spark-related classes, test_bitcoin.h
for testing utilities, and Boost testing framework. These are appropriate for the test suite.
9-40
: Good test coverage for serialization
The serialization test properly checks that an OwnershipProof object can be correctly serialized and deserialized, ensuring that all key components of the proof (A, t1, t2, t3) are preserved. This is essential for ensuring ownership proofs can be correctly transmitted and stored.
42-69
: Well-designed completeness test
The test verifies that the ownership proof system works end-to-end, from creating a proof to verifying it after serialization/deserialization. This is a key test for cryptographic functionality.
71-106
: Comprehensive negative test cases
The bad_proofs test case is particularly valuable, as it ensures that the verification system correctly rejects altered proofs. Testing each component (A, t1, t2, t3) separately is a thorough approach.
src/qt/walletmodel.h (5)
160-162
: Appropriate method signature for Spark address generation
The
generateSparkAddress
method has a clear purpose and appropriate return type (QString). It will generate a new Spark address and return it as a string, which aligns with the feature requirements.
203-204
: Comprehensive validation method for Spark name data
This method appropriately takes all required components (name, address, additional data) and returns a validation status with error information. The error reference parameter allows detailed error reporting to the UI.
205-206
: Well-designed transaction initialization method
The method accepts a fee amount and returns a
WalletModelTransaction
object, which is consistent with the wallet's transaction handling pattern.
207-208
: Useful lookup method for Spark name resolution
This method provides the essential functionality to look up a Spark address given a Spark name, which is a core requirement for the feature.
209-214
: Comprehensive transaction preparation method
The method includes all necessary parameters (transaction object, name data, fee, and coin control) for preparing a Spark name registration transaction. The signature is consistent with other transaction preparation methods in the class.
src/qt/walletmodel.cpp (5)
1326-1339
: Well-implemented Spark address generation
The implementation correctly:
- Gets the default Spark parameters
- Uses a lock for thread safety
- Generates a new address using the wallet's Spark wallet
- Sets the address in the address book
- Returns the encoded address
This follows good practices for wallet operations.
1562-1577
: Verify sparkNameValidityBlocks parameter usage
The method sets
sparkNameValidityBlocks
to 1000 with a comment that it "doesn't matter", but this parameter isn't explained further.
Please check whether this arbitrary value really has no effect on validation in
CSparkNameManager::ValidateSparkNameData
. If it truly doesn't matter, consider:
- Using a named constant with proper documentation
- Adding a comment explaining why this value doesn't affect validation
- Or if possible, refactor the validation interface to not require this parameter if it's not used
1579-1588
: Solid implementation of transaction initialization
The method correctly:
- Gets consensus parameters to determine the development fund address
- Sets up a recipient with the appropriate address and fee amount
- Creates and returns a transaction model with the recipient
This implementation aligns with the feature requirement that Spark name registration fees are sent to a development fund address.
1590-1600
: Good error handling in getSparkNameAddress
The method properly:
- Uses a lock when accessing the Spark name manager
- Returns an empty string if the name doesn't exist
- Returns the address if found
This provides a clean interface for name resolution.
1602-1656
: Comprehensive error handling in prepareSparkNameTransaction
The method implements thorough error handling:
- Checks balance availability
- Uses appropriate locks
- Properly handles various exceptions
- Checks for absurd fees
- Sets transaction fee and updates amounts
The structure mirrors other transaction preparation methods in the class, which is good for consistency.
src/qt/createsparknamepage.h (3)
1-17
: Good header organization and includes
The header includes all necessary dependencies and uses proper include guards. The class is appropriately placed in the UI hierarchy.
18-40
: Well-structured dialog class
The
CreateSparkNamePage
class is well-designed:
- Inherits from QDialog
- Uses the Q_OBJECT macro for Qt's meta-object system
- Provides proper constructor/destructor
- Has a setModel method to connect to the wallet model
- Overrides accept() to handle dialog submission
This follows Qt's best practices for dialog implementation.
41-45
: Clear slot definitions for UI interactions
The slots are appropriately defined for:
- Button click handling
- Text field changes
- Value changes
This provides the necessary connections between UI elements and application logic.
src/spark/state.cpp (7)
2-2
: Include usage appears fine.
No further comments on this line.
36-36
: Confirmed integration of sparkName manager in BuildSparkStateFromIndex.
This ensures SparkName-related data is initialized alongside SparkState. No issues found.
245-247
: Initialization of fBackupRewrittenSparkNames.
The logic to set this boolean value appears straightforward, and no concurrency issues are evident here.
316-329
: Adding spark name data to pindexNew.
The iteration over pblock->sparkTxInfo->sparkNames and population of pindexNew->addedSparkNames is clear. The code sets a flag to indicate that spark names were modified. Looks good overall.
343-346
: Updating spark names at block connection.
The call to RemoveSparkNamesLosingValidity followed by AddBlock ensures consistent updates to the manager’s state. Implementation appears correct.
389-392
: Properly removing spark name block data on disconnect.
DisconnectTipSpark now calls sparkNameManager->RemoveBlock(pindexDelete) to roll back spark names, which complements sparkState rollback. No issues found.
805-822
: Integration of spark name checks within CheckSparkTransaction flow.
The code checks conflicts with existing spark names before storing them in sparkTxInfo->sparkNames. The usage of a lambda for conflict checks can confuse certain static analyzers but is valid C++ code.
🧰 Tools
🪛 Cppcheck (2.10-2)
[error] 811-811: Syntax Error
(internalAstError)
src/qt/createsparknamepage.cpp (4)
17-27
: Constructor sets up UI and initializes controls.
Defaulting numberOfYearsEdit to 1 and employing updateFee() are appropriate.
55-71
: Validation logic in accept() is straightforward.
It checks for invalid addresses and sparkName data. The branching is concise and user-friendly.
73-81
: Fee calculation logic depends on sparkName length and years.
The boundary check ensures we do not exceed the maximum spark name length, and the line references the correct fee array index. No immediate issues.
83-144
: Potential large multiplications for sparkNameFee.
Multiplying (name.length()) × numberOfYears × COIN is valid but could grow large if used incautiously. However, given the strict maximumSparkNameLength and years range, integer overflow risk is minimal. Implementation appears acceptable with existing constraints.
src/qt/addresstablemodel.cpp (5)
27-27
: New SparkName constant introduced.
Reflects the new spark name address type. No issues.
96-103
: PendingSparkNameChange struct.
This structure stores spark name changes for later processing. Approach is logical.
105-113
: sparkNameAdded / sparkNameRemoved signal handlers.
Appending pending changes to a queue and locking cs_pendingSparkNameChanges is a reasonable concurrency pattern.
230-231
: Spark name detection via '@' prefix.
Label-based detection is straightforward. If the label starts with '@', the code sets the address type to SparkName.
Also applies to: 243-244
312-337
: processPendingSparkNameChanges.
Safely transitions changes from queued signals to the UI model. Lock usage is clear.
src/qt/addressbookpage.cpp (4)
16-16
: Header import inclusion looks correct.
No issues here. Including the definition for the new CreateSparkNamePage dialog is straightforward and necessary.
247-253
: [Duplicate of the issue above] Mismatch between displayed text and enumerator check.
Applies the same logic from lines 211-217.
277-277
: [Duplicate of the issue above] Mismatch between displayed text and enumerator check.
Again, ensure that the correct mechanism (e.g. currentData() == SparkName
) is used.
433-442
: Potential partial string matching issue in filterAcceptsRow.
Using dataStr.contains("spark name")
or dataStr.contains("spark")
can cause collisions and false positives. For instance, "spark name" also contains "spark". Consider exact matching or a more explicit classification field instead of substring matching.
src/sparkname.cpp (2)
14-49
: Possible concurrency concern in AddBlock/RemoveBlock.
These methods update shared maps without apparent locking. Confirm that an appropriate global or chain lock is held to prevent race conditions.
158-160
: Verify assumption of 24 blocks per hour.
At line 158, the code calculates blocks per year as 365 * 24 * 24
. This is correct only if the chain block time is 2.5 minutes. Confirm that this matches Firo's actual block interval; otherwise the required fee timeline will be over- or underestimated.
@@ -61,6 +62,7 @@ public Q_SLOTS:

private:
    Ui::AddressBookPage *ui;
    const PlatformStyle *platformStyle;
💡 Verification agent
🧩 Analysis chain
Added platformStyle member variable needs initialization.
While adding this member variable is appropriate for styling, ensure that it's properly initialized in all constructors to avoid potential null pointer dereferences.
🏁 Script executed:
#!/bin/bash
# Verify that platformStyle is initialized in the AddressBookPage constructor
# Search for AddressBookPage constructor implementation
grep -A 10 "AddressBookPage::AddressBookPage" src/qt/addressbookpage.cpp
Length of output: 402
Initialization Needed: Ensure the platformStyle Member Is Properly Assigned
The constructor for AddressBookPage uses the platformStyle parameter (e.g., in if (!platformStyle->getImagesOnButtons())) but doesn't assign it to the class member variable. This leaves the member uninitialized and potentially dangerous if later accessed.
-
Action Required: Update all constructors receiving a
PlatformStyle*
to initialize the member variable. For example, modify the initializer list in AddressBookPage::AddressBookPage
as follows:
 AddressBookPage::AddressBookPage(const PlatformStyle *platformStyle, Mode _mode, Tabs _tab, QWidget *parent, bool isReused) :
     QDialog(parent),
     ui(new Ui::AddressBookPage),
     model(0),
     mode(_mode),
     tab(_tab)
+    , platformStyle(platformStyle)
 {
     ui->setupUi(this);
     this->isReused = isReused;
     if (!platformStyle->getImagesOnButtons()) {
-
Verify Everywhere: Ensure that this change is applied in all constructors that accept a
PlatformStyle*
to prevent potential null pointer dereferences.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const PlatformStyle *platformStyle;

AddressBookPage::AddressBookPage(const PlatformStyle *platformStyle, Mode _mode, Tabs _tab, QWidget *parent, bool isReused) :
    QDialog(parent),
    ui(new Ui::AddressBookPage),
    model(0),
    mode(_mode),
    tab(_tab),
    platformStyle(platformStyle)
{
    ui->setupUi(this);
    this->isReused = isReused;
    if (!platformStyle->getImagesOnButtons()) {
        // ... rest of the code
    }
}
@@ -104,6 +107,10 @@ QValidator::State BitcoinAddressCheckValidator::validate(QString &input, int &po

bool BitcoinAddressCheckValidator::validateSparkAddress(const std::string& address) const
{
    // check for spark name
    if (address[0] == '@' && address.size() <= CSparkNameManager::maximumSparkNameLength + 1)
💡 Verification agent
🧩 Analysis chain
Verify dependency on CSparkNameManager.
The code references CSparkNameManager::maximumSparkNameLength
but doesn't explicitly include the header that would define this constant. Ensure that the proper header is included either here or in a header included by this file.
🏁 Script executed:
#!/bin/bash
# Check if the CSparkNameManager is properly included
grep -r "class CSparkNameManager" --include="*.h" .
# Check which files include the header
grep -r "#include.*sparkname" --include="*.h" --include="*.cpp" .
Length of output: 787
Action Required: Explicitly Include CSparkNameManager Dependency
The file src/qt/bitcoinaddressvalidator.cpp
references CSparkNameManager::maximumSparkNameLength
, but our verification shows that this constant is defined in src/sparkname.h
—a header that isn’t directly included in this file. To avoid potential compilation issues if transitive includes change, please ensure that the proper header (e.g., #include "sparkname.h"
) is added either directly in bitcoinaddressvalidator.cpp
or via one of its included headers.
return sparkWallet && sparkWallet->isAddressMine(address);
}
Important fix for potential null pointer dereference
This change adds a null check for sparkWallet before accessing its method, preventing a potential crash if sparkWallet is not initialized. This is an important safety improvement.
src/spark/sparkwallet.cpp
Outdated
CWalletTx CSparkWallet::CreateSparkNameTransaction(CSparkNameTxData &nameData, CAmount sparkNameFee, CAmount &txFee, const CCoinControl *coinConrol) {
    CRecipient devPayout;
    devPayout.nAmount = sparkNameFee;
    devPayout.scriptPubKey = GetScriptForDestination(CBitcoinAddress(Params().GetConsensus().stage3DevelopmentFundAddress).Get());
    devPayout.fSubtractFeeFromAmount = false;

    CWalletTx wtxSparkSpend = CreateSparkSpendTransaction({devPayout}, {}, txFee, coinConrol);

    const spark::Params* params = spark::Params::get_default();
    spark::SpendKey spendKey(params);
    try {
        spendKey = std::move(generateSpendKey(params));
    } catch (std::exception& e) {
        throw std::runtime_error(_("Unable to generate spend key."));
    }

    if (spendKey == spark::SpendKey(params))
        throw std::runtime_error(_("Unable to generate spend key, looks the wallet is locked."));

    spark::Address address(spark::Params::get_default());
    try {
        address.decode(nameData.sparkAddress);
    } catch (std::exception& e) {
        throw std::runtime_error(_("Invalid spark address"));
    }

    if (!isAddressMine(address))
        throw std::runtime_error(_("Spark address doesn't belong to the wallet"));

    CMutableTransaction tx = CMutableTransaction(*wtxSparkSpend.tx);
    size_t additionalSize = 0;
    CSparkNameManager::GetInstance()->AppendSparkNameTxData(tx, nameData, spendKey, fullViewKey, additionalSize);

    txFee += CWallet::GetMinimumFee(additionalSize, nTxConfirmTarget, mempool);

    wtxSparkSpend.tx = MakeTransactionRef(std::move(tx));
    return wtxSparkSpend;
}
🛠️ Refactor suggestion
Reassess fee handling for appended Spark Name data.
The transaction initially includes a developer payout with sparkNameFee, then appends name data and increases the fee afterward. If appending the data significantly changes the transaction size, you may need to recalculate the fee or re-run transaction creation; otherwise the user might underpay or overpay fees in edge scenarios.
bool verify_own(const Scalar& m,
                OwnershipProof& proof) const;
🛠️ Refactor suggestion
Make the proof parameter const in the verify_own method.
The verify_own method takes a non-const reference to OwnershipProof, but it is used only for verification, without modification. Making it const would improve const-correctness and better represent the method's behavior.
bool verify_own(const Scalar& m,
- OwnershipProof& proof) const;
+ const OwnershipProof& proof) const;
Additionally, consider adding documentation to explain the method's purpose and return value:
+/**
+ * Verifies a cryptographic proof of address ownership.
+ * @param m The scalar message/nonce used in the proof generation
+ * @param proof The ownership proof to verify
+ * @return true if the proof is valid, false otherwise
+ */
bool verify_own(const Scalar& m,
const OwnershipProof& proof) const;
// 1-20 symbols, only alphanumeric characters and hyphens
std::string name;
Documentation vs. actual code mismatch on allowed characters.
The comment states "only alphanumeric characters and hyphens," but the implementation (in sparkname.cpp) also allows a dot ('.'). Either update the comment here to match the actual logic, or remove the dot allowance in code if dots are not intended.
-// 1-20 symbols, only alphanumeric characters and hyphens
+// 1-20 symbols, only alphanumeric characters, hyphens, and optionally dots
bool CSparkNameManager::IsSparkNameValid(const std::string &name)
{
    if (name.size() < 1 || name.size() > maximumSparkNameLength)
        return false;

    for (char c: name)
        if (!isalnum(c) && c != '-' && c != '.')
            return false;
Code allows '.' whereas the comment says “only alphanumeric and hyphens.”
This contradicts the comment in sparkname.h (lines 30-31). If you want to permit dots, update the doc. Otherwise, remove the dot allowance here.
-if (!isalnum(c) && c != '-' && c != '.')
+if (!isalnum(c) && c != '-')
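Whichever character set is settled on, the rule is small enough to unit-test directly. A standalone sketch of the stricter variant the diff proposes (no dots), with the length bound as an assumed constant:

```cpp
#include <cassert>
#include <cctype>
#include <string>

constexpr size_t kMaxSparkNameLength = 20; // assumed to match maximumSparkNameLength

// The stricter variant under discussion: 1-20 characters,
// alphanumerics and hyphens only (no dots).
bool IsSparkNameValidStrict(const std::string& name) {
    if (name.empty() || name.size() > kMaxSparkNameLength)
        return false;
    for (unsigned char c : name)
        if (!std::isalnum(c) && c != '-')
            return false;
    return true;
}
```

Passing `unsigned char` to `std::isalnum` also avoids the undefined behavior the original risks for negative `char` values.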
Actionable comments posted: 0
🧹 Nitpick comments (5)
src/qt/createsparknamepage.cpp (5)
15-15: Consider using a constexpr or static const variable for the confirmation delay.
Using a macro for this delay is acceptable, but switching to a typed constant (e.g., static const int or constexpr) improves type safety and readability.
96-96: Make spark name validity blocks more configurable.
Currently, sparkNameData.sparkNameValidityBlocks is set to numberOfYears*365*24*24. If the block time or the day calculation ever changes, the hard-coded 24*24 factor becomes inaccurate. Consider a named constant, or compute the block count from the current chain's block interval.
101-101: Fix typo in error message.
"paramaeter" is spelled incorrectly; change it to "parameter".
- QMessageBox::critical(this, tr("Error validating spark name paramaeter"), strError.c_str());
+ QMessageBox::critical(this, tr("Error validating spark name parameter"), strError.c_str());
106-106: Handle name length validation gracefully instead of using an assert.
If the user surpasses the maximum name length, an assertion stops the program in debug mode (and is compiled out entirely in release builds, where NDEBUG disables it). Return an error to the UI instead:
- assert(!name.empty() && name.length() <= CSparkNameManager::maximumSparkNameLength);
+ if (name.empty() || name.length() > CSparkNameManager::maximumSparkNameLength) {
+     QMessageBox::critical(this, tr("Error"), tr("Invalid spark name length."));
+     return false;
+ }
125-126: Improve grammar in the confirmation dialog message.
The current alert text omits an article before "development fund transparent address"; add "a" for clarity:
- questionString.append(tr(" You are sending Firo from a Spark address to development fund transparent address."));
+ questionString.append(tr(" You are sending Firo from a Spark address to a development fund transparent address."));
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/qt/createsparknamepage.cpp
(1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (2)
src/qt/createsparknamepage.cpp (2)
39-43: Check for a null WalletModel pointer before generating a spark address.
Currently, the code calls model->generateSparkAddress() without verifying that model is non-null, which could lead to a crash if the model isn't set properly. Proposed diff:

void CreateSparkNamePage::on_generateButton_clicked()
{
+   if (!model) {
+       QMessageBox::critical(this, tr("Error"), tr("Wallet model is not available."));
+       return;
+   }
    QString newSparkAddress = model->generateSparkAddress();
    ui->sparkAddressEdit->setText(newSparkAddress);
}
73-81: Validate array access for the spark name fee.
Although the check on line 77 prevents empty or overly long names, accessing nSparkNamesFee[sparkName.length()] can still go out of bounds if maximumSparkNameLength is changed elsewhere. Consider explicit handling or additional guards around the array index.
Actionable comments posted: 4
🧹 Nitpick comments (10)
src/libspark/ownership_proof.h (2)
10-12: Consider updating memoryRequired() when adding new fields.
Currently, the function calculates memory usage from the existing Scalar and GroupElement members. If you add more members in the future, remember to update this function accordingly to avoid discrepancies.
15-21: Add a version strategy to handle structure changes gracefully.
The current serialization approach works, but if more fields need to be serialized in future releases, a versioned serialization scheme is recommended to ensure backward compatibility.

src/spark/sparkwallet.cpp (2)
1625-1627: Add a more specific error message for address ownership.
When a user attempts to create a spark name for an address they don't own, the error message could state the problem more precisely.
- if (!isAddressMine(address))
-     throw std::runtime_error(_("Spark address doesn't belong to the wallet"));
+ if (!isAddressMine(address))
+     throw std::runtime_error(_("Cannot create spark name: The provided spark address doesn't belong to this wallet"));
1604-1605: Consider handling potential transaction creation failure.
The CreateSparkSpendTransaction call can fail for several reasons, but there's no specific error handling for it. A try-catch block would yield more specific error messages:
- CWalletTx wtxSparkSpend = CreateSparkSpendTransaction({devPayout}, {}, txFee, coinConrol,
-     sparkNameManager->GetSparkNameTxDataSize(nameData) + 20 /* add a little bit to the fee to be on the safe side */);
+ CWalletTx wtxSparkSpend;
+ try {
+     wtxSparkSpend = CreateSparkSpendTransaction({devPayout}, {}, txFee, coinConrol,
+         sparkNameManager->GetSparkNameTxDataSize(nameData) + 20 /* add a little bit to the fee to be on the safe side */);
+ } catch (const std::exception& e) {
+     throw std::runtime_error(strprintf(_("Failed to create spark name transaction: %s"), e.what()));
+ }

src/sparkname.h (5)
35-36: Consider adding more explanation about ownership proof mechanics.
The comment could explain how the ownership proof works and why it prevents unauthorized name registrations.
- // proof of ownership of the spark address
+ // cryptographic proof that the transaction creator owns the private keys
+ // corresponding to the spark address, preventing unauthorized registrations
40-41: Clarify the purpose of the hashFailsafe mechanism.
The comment is technical but doesn't explain why the field is necessary. A clearer explanation would help future developers:
- // failsafe if the hash of the transaction data is can't be converted to a scalar for proof creation/verification
+ // Counter value used as entropy when the transaction hash cannot be converted to a valid scalar value.
+ // This allows retrying with different values until a valid proof can be created.
90-102: Consider adding thread safety documentation for the singleton manager.
The CSparkNameManager class is implemented as a singleton but has no documentation about thread safety. Since it manages global state, its guarantees should be documented. Add a comment at the class level:
/**
 * Manager class for Spark names functionality.
 * This class is implemented as a singleton and manipulates global state.
 * Thread safety: Methods that modify state should be called with appropriate locks held.
 * Read operations are generally thread-safe unless concurrent modifications are occurring.
 */
147-150: Mark non-mutating methods const.
ValidateSparkNameData and GetSparkNameTxDataSize don't modify the manager's state, so the methods themselves should be declared const.
- bool ValidateSparkNameData(const CSparkNameTxData &sparkNameData, std::string &errorDescription);
+ bool ValidateSparkNameData(const CSparkNameTxData &sparkNameData, std::string &errorDescription) const;
- size_t GetSparkNameTxDataSize(const CSparkNameTxData &sparkNameData);
+ size_t GetSparkNameTxDataSize(const CSparkNameTxData &sparkNameData) const;
38-39: Document the validation range for additionalInfo.
Since additionalInfo is limited to 1024 characters by the validation logic, the constraint should be stated in the comment.
- // additional information, string, up to 1024 symbols. Can be used for future extensions (e.g. for storing a web link)
+ // additional information, string, must be 1024 symbols or less. Can be used for future extensions (e.g. for storing a web link)
229-257: Improve error descriptions with more specific guidance.
The error messages returned by ValidateSparkNameData could tell users how to fix the issues:
- errorDescription = "invalid spark name";
+ errorDescription = "Invalid spark name. Use 1-20 alphanumeric characters, hyphens, or dots.";
- errorDescription = "additional info is too long";
+ errorDescription = "Additional info exceeds maximum length of 1024 characters.";
- errorDescription = "transaction can't be valid for more than 10 years";
+ errorDescription = "Validity period exceeds maximum of 10 years. Reduce the number of blocks.";
- errorDescription = "name already exists with another spark address as a destination";
+ errorDescription = "This name is already registered to a different spark address.";
- errorDescription = "spark address is already used for another name";
+ errorDescription = "This spark address is already associated with a different name.";
- errorDescription = "spark name transaction with that name is already in the mempool";
+ errorDescription = "A transaction for this spark name is already pending in the mempool. Wait for it to confirm or be rejected.";
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
src/libspark/ownership_proof.h (1 hunks)
src/spark/sparkwallet.cpp (7 hunks)
src/spark/sparkwallet.h (3 hunks)
src/sparkname.cpp (1 hunks)
src/sparkname.h (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- src/spark/sparkwallet.h
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (7)
src/libspark/ownership_proof.h (1)
24-28: Investigate whether default initialization is sufficient.
It may be prudent to add a constructor that explicitly initializes A, t1, t2, and t3 to known values (e.g., zero) where required for cryptographic correctness and clarity, preventing any uninitialized use.

src/spark/sparkwallet.cpp (4)
268-290: Implementation looks solid!
The isAddressMine method checks whether the address belongs to the wallet by comparing its public key components (Q1 and Q2) with those of the wallet's addresses. Overloading to handle both string-encoded addresses and native address objects is a good pattern.
1224-1229: Well-considered parameter addition for fee calculation.
Adding the additionalTxSize parameter allows better fee estimation before the transaction is finalized, especially when it will later be extended with additional data.
1315-1315: Appropriate fee calculation update.
Passing the additional transaction size to SelectSparkCoins ensures fee calculations account for the eventually larger transaction after spark name data is appended.
1596-1633: Reassess fee handling for appended Spark Name data.
The transaction initially includes a developer payout with sparkNameFee, then appends name data and increases the fee afterward. If appending the data significantly changes the transaction size, you may need to recalculate the fee or re-run transaction creation; otherwise the user might underpay or overpay in edge scenarios.

src/sparkname.h (1)
30-31: Documentation vs. actual code mismatch on allowed characters.
The comment states "only alphanumeric characters and hyphens," but the implementation (in sparkname.cpp) also allows a dot ('.'). Make the comment match the actual logic, or drop the dot allowance in code if dots are not intended.
-// 1-20 symbols, only alphanumeric characters and hyphens
+// 1-20 symbols, only alphanumeric characters, hyphens, and dots

src/sparkname.cpp (1)
372-380: Consistent allowed characters for spark names.
The implementation allows dots ('.') in spark names, but the header comment (lines 30-31) only mentions alphanumeric characters and hyphens. Resolve the inconsistency:
- if (!isalnum(c) && c != '-' && c != '.')
+ if (!isalnum(c) && c != '-')
Or update the documentation in the header file to match the implementation.
return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkStartBlock));
Incorrect error message reference to block height.
The error message refers to nSparkStartBlock, but it should reference nSparkNamesStartBlock, which is the actual variable being compared.
- return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkStartBlock));
+ return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkNamesStartBlock));
#include "validation.h"
#include "ui_interface.h"

CSparkNameManager *CSparkNameManager::sharedSparkNameManager = new CSparkNameManager();
🛠️ Refactor suggestion
Use a thread-safe singleton initialization pattern.
The singleton initialization is not thread-safe. Consider using a Meyer's singleton or an atomic initialization.
-CSparkNameManager *CSparkNameManager::sharedSparkNameManager = new CSparkNameManager();
+CSparkNameManager *CSparkNameManager::GetInstance() {
+ static CSparkNameManager instance;
+ return &instance;
+}
And in the header file:
- static CSparkNameManager *GetInstance() { return sharedSparkNameManager; };
+ static CSparkNameManager *GetInstance();
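The suggested Meyers singleton relies on the C++11 "magic statics" guarantee. A minimal self-contained sketch (class name is illustrative, not the PR's actual class):

```cpp
#include <cassert>

// Sketch of the suggested Meyers singleton.
class SparkNameManagerSketch {
public:
    // C++11 guarantees the function-local static is constructed exactly
    // once, even if several threads call GetInstance() concurrently.
    static SparkNameManagerSketch* GetInstance() {
        static SparkNameManagerSketch instance;
        return &instance;
    }
    // Non-copyable: there is only ever one instance.
    SparkNameManagerSketch(const SparkNameManagerSketch&) = delete;
    SparkNameManagerSketch& operator=(const SparkNameManagerSketch&) = delete;
private:
    SparkNameManagerSketch() = default;
};
```

This also avoids the static initialization order problem of a file-scope `new`, since construction is deferred to first use.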
for (uint32_t n=0; ; n++) {
    sparkNameData.addressOwnershipProof.clear();
    sparkNameData.hashFailsafe = n;

    CMutableTransaction txCopy(txSparkSpend);
    CDataStream serializedSparkNameData(SER_NETWORK, PROTOCOL_VERSION);
    serializedSparkNameData << sparkNameData;
    txCopy.vExtraPayload.insert(txCopy.vExtraPayload.end(), serializedSparkNameData.begin(), serializedSparkNameData.end());

    CHashWriter ss(SER_GETHASH, PROTOCOL_VERSION);
    ss << txCopy;

    spark::Scalar m;
    try {
        m.SetHex(ss.GetHash().ToString());
    }
    catch (const std::exception &) {
        continue; // increase hashFailsafe and try again
    }
🛠️ Refactor suggestion
Potential infinite loop risk in AppendSparkNameTxData.
The function has an infinite loop with no upper bound on the hashFailsafe counter, which could theoretically run forever if a valid hash cannot be found. Consider adding a maximum number of attempts.
+ const uint32_t MAX_HASH_FAILSAFE_ATTEMPTS = 1000;
- for (uint32_t n=0; ; n++) {
+ for (uint32_t n=0; n < MAX_HASH_FAILSAFE_ATTEMPTS; n++) {
sparkNameData.addressOwnershipProof.clear();
sparkNameData.hashFailsafe = n;
// ... existing code ...
break;
}
+
+ if (sparkNameData.hashFailsafe >= MAX_HASH_FAILSAFE_ATTEMPTS) {
+ throw std::runtime_error("Failed to generate a valid transaction hash after maximum attempts");
+ }
constexpr int nBlockPerYear = 365*24*24; // 24 blocks per hour
int nYears = (sparkNameData.sparkNameValidityBlocks + nBlockPerYear-1) / nBlockPerYear;
Verify the blocks-per-year calculation.
The calculation uses 365*24*24 with a comment saying "24 blocks per hour". Confirm that both factors of 24 (hours per day and blocks per hour) match the chain's actual block interval; for a different rate the constant should be, for example:
- constexpr int nBlockPerYear = 365*24*24; // 24 blocks per hour
+ constexpr int nBlockPerYear = 365*24; // 24 blocks per day
or
- constexpr int nBlockPerYear = 365*24*24; // 24 blocks per hour
+ constexpr int nBlockPerYear = 365*24*60; // 1 block per minute (60 blocks per hour * 24 hours)
Adjust based on your actual block generation rate.
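Whatever per-year block count is correct for the chain, the conversion itself is a ceiling division. A sketch assuming 2.5-minute blocks (24 blocks per hour, which is what the original constant works out to):

```cpp
// Assumed rate: 2.5-minute blocks, i.e. 24 blocks/hour * 24 hours * 365 days.
constexpr int kBlocksPerYear = 24 * 24 * 365; // 210240

// Ceiling division: any partial year of validity is charged as a full year.
int YearsForValidity(int validityBlocks) {
    return (validityBlocks + kBlocksPerYear - 1) / kBlocksPerYear;
}
```

The `+ kBlocksPerYear - 1` term rounds up, so one extra block past a year boundary is billed as an additional year.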
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/test/sparkname_tests.cpp (1)
113-202: Add boundary condition tests.
Currently, the test covers typical spark name creation and reorg scenarios, but it lacks explicit checks for zero-length or maximum-length names, extremely large validity durations, or edge-case fees. Adding these would increase coverage and help catch off-by-one and fee calculation issues.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
src/Makefile.test.include (2 hunks)
src/sparkname.cpp (1 hunks)
src/test/sparkname_tests.cpp (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- src/Makefile.test.include
🧰 Additional context used
🪛 Cppcheck (2.10-2)
src/test/sparkname_tests.cpp
[error] 111-111: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
src/sparkname.cpp
[performance] 377-377: Variable 'vHave' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 51-51: Variable 'hash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 27-27: Variable 'txid' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 65-65: Variable 'txid' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 69-69: Variable 'addressType' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 70-70: Variable 'addressHash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 147-147: Variable 'blockHash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 182-182: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 183-183: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 184-184: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 216-216: Variable 'script' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 270-270: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 271-271: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 274-274: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 311-311: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 312-312: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 344-344: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 345-345: Variable 'hashBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 23-23: Variable 'prevhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 44-44: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 45-45: Variable 'addressBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 46-46: Variable 'txhash' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 52-52: Variable 'type' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 53-53: Variable 'addressBytes' is assigned in constructor body. Consider performing initialization in initialization list.
(useInitializationList)
[performance] 10-10: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 12-12: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 81-81: Function parameter 'label' should be passed by const reference.
(passedByValue)
[performance] 105-105: Function parameter 'data' should be passed by const reference.
(passedByValue)
[performance] 37-37: Function parameter 'additional_data' should be passed by const reference.
(passedByValue)
[performance] 38-38: Function parameter 'associated_data' should be passed by const reference.
(passedByValue)
[performance] 74-74: Function parameter 'hash_bind_inner' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_name' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_sparkAddress' should be passed by const reference.
(passedByValue)
[performance] 75-75: Function parameter '_additionalInfo' should be passed by const reference.
(passedByValue)
[error] 106-106: Signed integer overflow for expression '1<<31'.
(integerOverflow)
[performance] 22-22: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 34-34: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 37-37: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 45-45: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 48-48: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
[performance] 83-83: Prefer prefix ++/-- operators for non-primitive types.
(postfixOperator)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (6)
src/test/sparkname_tests.cpp (1)

70-71: Revisit year-block calculation.

This logic for computing the number of years based on `24 * 24 * 365` might be inaccurate if the block rate differs from 2.5 minutes. A previous review flagged the same concern, suggesting that it's misaligned with the actual block interval.

```diff
-int numberOfYears = (sparkNameData.sparkNameValidityBlocks + 24 * 24 * 365 - 1) / (24 * 24 * 365);
+// Confirm block interval and recalculate blocks per year accordingly,
+// e.g., for 10-minute blocks, use 6*24*365 => 52560 blocks per year.
```

src/sparkname.cpp (5)

12-12: Favor a thread-safe singleton pattern.

Instantiating the static pointer at file scope may lead to race conditions if accessed before initialization or concurrently from multiple threads. This issue was previously identified.

```diff
-CSparkNameManager *CSparkNameManager::sharedSparkNameManager = new CSparkNameManager();
+CSparkNameManager *CSparkNameManager::GetInstance() {
+    static CSparkNameManager instance;
+    return &instance;
+}
```

🧰 Tools
🪛 Cppcheck (2.10-2)
[performance] 12-12: Function parameter 'label' should be passed by const reference.
(passedByValue)
152-153: Reference the correct consensus parameter.

The error message uses `consensusParams.nSparkStartBlock` but should mention `nSparkNamesStartBlock`, as flagged in a past review.

```diff
-return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkStartBlock));
+return state.DoS(100, error("CheckSparkNameTx: spark names are not allowed before block %d", consensusParams.nSparkNamesStartBlock));
```
158-158: Validate block-per-year assumption.

This uses `365*24*24` to represent a year, assuming 24 blocks per hour. If the actual chain block time differs, this could misprice name fees or validity durations.

```diff
-constexpr int nBlockPerYear = 365*24*24; // 24 blocks per hour
+// Revisit block frequency for an accurate estimation,
+// e.g., 6*24*365 => 52560 blocks per year for 10-minute blocks.
```
283-302: Add failsafe limit to prevent infinite loop.

Without a condition to break out after a certain number of attempts, this loop could theoretically run forever if no valid hash is found. This was raised in an older review.

```diff
+const uint32_t MAX_HASH_FAILSAFE_ATTEMPTS = 1000;
 for (uint32_t n=0; ; n++) {
     ...
+    if (n == MAX_HASH_FAILSAFE_ATTEMPTS - 1) {
+        throw std::runtime_error("Failed to generate a valid ownership proof after maximum attempts");
+    }
 }
```
384-385: Clarify allowed characters vs. documentation.

This code allows '.' in spark names, but the earlier documentation indicates only alphanumeric characters and hyphens. A similar discrepancy was reported previously.

```diff
-if (!isalnum(c) && c != '-' && c != '.')
+// Update documentation or remove '.' to match the doc specifying only alphanumeric & hyphens.
```
Actionable comments posted: 0
🧹 Nitpick comments (5)
src/test/sparkname_tests.cpp (5)
35-38: Consider revising assertion placement in utility method

The utility method `GetSparkNameAdditionalData` contains a `BOOST_CHECK` assertion, which is unusual for a utility method. Assertions in utility methods make them less flexible, since they immediately fail the test rather than allowing the caller to handle the failure case.

Consider refactoring to either:

- Return an optional value (`std::optional<std::string>`), or
- Throw an exception when the name is not present instead of asserting

```diff
-    std::string GetSparkNameAdditionalData(const std::string &name) {
-        BOOST_CHECK(IsSparkNamePresent(name));
-        return sparkNameManager->GetSparkNameAdditionalData(name);
-    }
+    std::string GetSparkNameAdditionalData(const std::string &name) {
+        std::string address;
+        if (!sparkNameManager->GetSparkAddress(name, address)) {
+            throw std::runtime_error("Spark name not present");
+        }
+        return sparkNameManager->GetSparkNameAdditionalData(name);
+    }
```
69-72: Add clarifying comments for validity period and fee calculation

The calculation for `numberOfYears` and `sparkNameFee` could benefit from more explanation, particularly clarifying what `24 * 24 * 365` represents and how it relates to the validity blocks parameter.

```diff
 if (sparkNameFee == 0) {
     BOOST_ASSERT(sparkNameData.name.length() <= CSparkNameManager::maximumSparkNameLength);
-    int numberOfYears = (sparkNameData.sparkNameValidityBlocks + 24 * 24 * 365 - 1) / (24 * 24 * 365);
+    // 24 blocks per hour * 24 hours * 365 days = blocks per year
+    const int blocksPerYear = 24 * 24 * 365;
+    // Calculate number of years, rounding up
+    int numberOfYears = (sparkNameData.sparkNameValidityBlocks + blocksPerYear - 1) / blocksPerYear;
     sparkNameFee = consensus.nSparkNamesFee[sparkNameData.name.length()] * COIN * numberOfYears;
 }
```
130-254: Consider breaking down the test case into smaller focused tests

The test case is quite comprehensive but lengthy, making it harder to understand and maintain. Breaking it down into smaller, focused test cases would improve readability and maintainability. Consider refactoring the `general` test case into multiple smaller test cases, each testing a specific aspect:

```cpp
BOOST_AUTO_TEST_CASE(basic_sparkname_creation_and_lookup)
{
    // Test basic creation and lookup of spark names (lines 132-141)
}

BOOST_AUTO_TEST_CASE(block_invalidation_and_reprocessing)
{
    // Test effects of block invalidation on spark names (lines 142-151)
}

BOOST_AUTO_TEST_CASE(mempool_conflicts)
{
    // Test handling of mempool conflicts (lines 153-174)
}

// And so on for other aspects tested
```
40-46: Consider parameterizing the initialization method

The `Initialize` method uses hardcoded values for the number of blocks and coin amounts. Making these configurable would allow for more flexible testing scenarios.

```diff
-    void Initialize() {
+    void Initialize(int blocks = 2000-1, std::vector<CAmount> amounts = {50 * COIN, 60 * COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN}) {
         std::vector<CMutableTransaction> mintTxs;
-        GenerateBlocks(2000-1);
-        GenerateMints({50 * COIN, 60 * COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN, 10*COIN}, mintTxs);
+        GenerateBlocks(blocks);
+        GenerateMints(amounts, mintTxs);
         GenerateBlock(mintTxs);
         pwalletMain->SetBroadcastTransactions(true);
     }
```
207-219: Add explanatory comments for block generation flow

The flow of generating blocks and checking chain heights would benefit from more explanatory comments to make the test's intent clearer.

```diff
 // shouldn't get into the block as well
 int oldHeight = chainActive.Height();
 GenerateBlock({tx3});
+// Check that the block was rejected because of the invalid transaction
 BOOST_CHECK_EQUAL(chainActive.Height(), oldHeight);

 // one more block and testname should be gone
 GenerateBlock({});
 BOOST_CHECK(!IsSparkNamePresent("testname"));

 oldHeight = chainActive.Height();
-// tx3 should go ahead now
+// tx3 should be accepted now that the conflicting name has expired
 GenerateBlock({tx3});
+// Check that the block was successfully added to the chain
 BOOST_CHECK_EQUAL(chainActive.Height(), oldHeight+1);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/test/sparkname_tests.cpp (1 hunks)
🧰 Additional context used
🪛 Cppcheck (2.10-2)
src/test/sparkname_tests.cpp
[error] 128-128: There is an unknown macro here somewhere. Configuration is required. If BOOST_FIXTURE_TEST_SUITE is a macro then please configure it.
(unknownMacro)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (3)
src/test/sparkname_tests.cpp (3)
102-117: Well-designed test utility for transaction manipulation

The `ModifySparkNameTx` method is a good utility for testing edge cases by allowing direct manipulation of transaction data. It effectively abstracts away the complexity of parsing, modifying, and re-serializing the transaction data.

235-253: Good test coverage for address ownership verification

The tests for address ownership verification are thorough and verify both the original case (tampering with the proof) and a secondary case (modifying the name after generating the proof). This helps ensure that the ownership proof mechanism is robust against different types of modifications.

228-234: Well-implemented fee calculation tests

The tests effectively verify that the fee calculation takes into account both the name length and validity period (in years). They include positive and negative test cases, ensuring the system rejects transactions with insufficient fees.
Actionable comments posted: 1
♻️ Duplicate comments (2)
src/rpc/blockchain.cpp (2)
247-248: 🛠️ Refactor suggestion

Handle potential failure of GetSparkAddress().

The code calls `GetSparkAddress()` but doesn't check its return value. If the function fails, the code will continue using potentially invalid data.

```diff
 std::string SparkAddr;
-sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+bool addressFound = sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+if (!addressFound) {
+    throw JSONRPCError(RPC_INTERNAL_ERROR, "Failed to retrieve address for spark name");
+}
```
220-236: 🛠️ Refactor suggestion

Improve validation of input parameters.

The help text suggests exactly one parameter is required, but the check `request.params.size() > 1` would allow `request.params.size() == 0`. Consider checking for exactly one parameter.

```diff
-    if (request.fHelp || request.params.size() > 1) {
+    if (request.fHelp || request.params.size() != 1) {
         throw std::runtime_error(
             "getsparknamedata ( sparkname )\n"
             "\nReturns info about spark name.\n"
             "\nArguments:\n"
-            "Spark name (string)\n"
+            "1. \"sparkname\" (string, required) The spark name to look up\n"
             "\nResult:\n"
```
🧹 Nitpick comments (2)
src/rpc/blockchain.cpp (2)
28-28: Consider using a relative path from the project root for inclusion.

The include path `"../sparkname.h"` uses a relative path that might be fragile if the file organization changes. Consider using a path relative to the project root.

```diff
-#include "../sparkname.h"
+#include "sparkname.h"
```
180-216: Handle GetSparkAddress() failure consistently.

The `getsparknames` function retrieves addresses but doesn't handle the case where `GetSparkAddress()` fails; it simply skips adding the address to the entry. Consider providing an error message or indicator for addresses that couldn't be retrieved.

```diff
 UniValue entry(UniValue::VOBJ);
 entry.push_back(Pair("name", name));
 std::string SparkAddr;
-if (sparkNameManager->GetSparkAddress(name, SparkAddr))
+if (sparkNameManager->GetSparkAddress(name, SparkAddr)) {
     entry.push_back(Pair("address", SparkAddr));
+} else {
+    entry.push_back(Pair("address", "ERROR: Unable to retrieve address"));
+}
 result.push_back(entry);
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/rpc/blockchain.cpp (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (2)
src/rpc/blockchain.cpp (2)
251-252: Unused network variable.

The `network` variable is defined but never used in the function.

```diff
 UniValue result(UniValue::VARR);
-unsigned char network = spark::GetNetworkType();
```

1746-1747: Functions look well-registered in the command table.

The new RPC functions are correctly registered in the command table with appropriate categorization.
Actionable comments posted: 0
♻️ Duplicate comments (3)
src/rpc/blockchain.cpp (3)
244-249: ⚠️ Potential issue

Validate spark name existence before retrieving data.

The function doesn't check if the provided spark name exists before trying to get its data, which could lead to undefined behavior.

```diff
 std::string sparkName = request.params[0].get_str();
 CSparkNameManager *sparkNameManager = CSparkNameManager::GetInstance();
+// Check if the spark name exists
+if (!sparkNameManager->SparkNameExists(sparkName)) {
+    throw JSONRPCError(RPC_INVALID_PARAMETER, "Spark name does not exist");
+}
 std::string SparkAddr;
 sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
```
250-252: ⚠️ Potential issue

Remove unused network variable.

The `network` variable is defined but never used in the function.

```diff
 UniValue result(UniValue::VOBJ);
-unsigned char network = spark::GetNetworkType();
```
247-248: ⚠️ Potential issue

Add error handling for GetSparkAddress failure.

The function should check the return value of `GetSparkAddress` and handle the case where it fails to retrieve an address.

```diff
 std::string SparkAddr;
-sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+bool addressFound = sparkNameManager->GetSparkAddress(sparkName, SparkAddr);
+if (!addressFound) {
+    throw JSONRPCError(RPC_INTERNAL_ERROR, "Failed to retrieve address for spark name");
+}
```
🧹 Nitpick comments (2)
src/rpc/blockchain.cpp (2)
219-236: Parameter description in help text needs clarification.

The help text doesn't follow the standard format for argument descriptions. The parameter description should include the parameter number, name, type, and whether it's required.

```diff
 "\nArguments:\n"
-"Spark name (string)\n"
+"1. \"sparkname\" (string, required) The spark name to query\n"
```
253-260: Consider handling possible nulls in additional data.

The function should handle the possibility of null or missing additional data for the spark name.

```diff
 result.push_back(Pair("address", SparkAddr));
 uint64_t nameBlockHeight = sparkNameManager->GetSparkNameBlockHeight(sparkName);
 result.push_back(Pair("validUntil", nameBlockHeight));
 std::string sparkNameData = sparkNameManager->GetSparkNameAdditionalData(sparkName);
+// Handle case where additional data might be empty
+if (sparkNameData.empty()) {
+    sparkNameData = "";
+}
 result.push_back(Pair("additionalInfo", sparkNameData));
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
- src/rpc/blockchain.cpp (3 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (3)
src/rpc/blockchain.cpp (3)
28-28: Include path looks good.

This include provides access to the new `CSparkNameManager` class needed for Spark name operations.

180-216: RPC command implementation looks solid.

The `getsparknames` function correctly:

- Validates parameters
- Checks if Spark functionality is activated
- Retrieves and formats Spark names with their addresses

Good job on the thread safety with the `LOCK(cs_main)` and proper error handling when Spark isn't activated.

1746-1747: RPC command registration looks good.

The new RPC commands are properly registered in the command table.
Actionable comments posted: 0
🧹 Nitpick comments (1)
src/qt/addresstablemodel.cpp (1)
38-43: Consider uncommenting or removing the AddressType enum.

The enum is currently commented out but is still being modified. If it's being used elsewhere, it should be uncommented. If it's deprecated, consider removing it entirely to avoid confusion.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- src/qt/addressbookpage.cpp (9 hunks)
- src/qt/addressbookpage.h (2 hunks)
- src/qt/addresstablemodel.cpp (12 hunks)
- src/rpc/blockchain.cpp (1 hunks)
- src/rpc/client.cpp (1 hunks)
- src/wallet/rpcwallet.cpp (3 hunks)
✅ Files skipped from review due to trivial changes (1)
- src/rpc/blockchain.cpp
🚧 Files skipped from review as they are similar to previous changes (3)
- src/qt/addressbookpage.h
- src/rpc/client.cpp
- src/qt/addressbookpage.cpp
🧰 Additional context used
🧬 Code Definitions (2)
src/qt/addresstablemodel.cpp (1)
- src/qt/addresstablemodel.h (4): `address` (82-82), `address` (87-87), `address` (112-112), `pubCoin` (113-113)

src/wallet/rpcwallet.cpp (5)
- src/wallet/wallet.h (1): `sparkNameData` (1105-1109)
- src/rpc/server.h (4): `HelpExampleRpc` (202-202), `name` (156-156), `name` (157-157), `name` (179-179)
- src/rpc/server.cpp (4): `HelpExampleRpc` (592-596), `HelpExampleRpc` (592-592), `name` (367-373), `name` (367-367)
- src/spark/state.h (3): `IsSparkAllowed` (45-45), `IsSparkAllowed` (46-46), `GetNetworkType` (47-47)
- src/sparkname.h (10): `name` (110-110), `name` (119-119), `name` (156-156), `name` (157-157), `name` (161-161), `name` (163-163), `sparkName` (170-170), `GetNetworkType` (12-12), `sparkNameData` (147-147), `sparkNameData` (150-150)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Create Source Distribution
🔇 Additional comments (17)
src/wallet/rpcwallet.cpp (5)
3812-3823: Enhancement to `spendspark` function to support spark names.

The modified code now checks if the name starts with '@' to interpret it as a spark name rather than a direct address. If so, it calls `GetSparkAddress` from the SparkNameManager to retrieve the corresponding spark address. This allows users to use human-readable aliases instead of complex spark addresses.
3911-3958: New RPC method `getsparknames` implemented correctly.

This function retrieves and lists all registered spark names, with an optional parameter to filter and show only names that belong to the wallet. The implementation properly validates that Spark is activated before proceeding and correctly formats the response with name and address pairs.

3960-4004: New RPC method `getsparknamedata` implemented properly.

This method retrieves detailed information about a specific spark name, including its associated address, validity period (block height), and any additional metadata. The implementation checks for Spark activation and handles the data retrieval appropriately.
4006-4084: New RPC method `registersparkname` implemented with proper validation.

This function handles the registration of new spark names. It includes appropriate validation:

- Ensures the wallet is unlocked and Spark is activated
- Validates name length (≤ 20 characters)
- Verifies the spark address format
- Calculates the fee based on name length according to consensus parameters
- Creates and commits a transaction for registration

The implementation follows good error-handling practices by providing descriptive error messages for various failure scenarios.
5956-5958: Correctly registered new RPC methods in the command table.

The three new spark name functions have been properly added to the RPC command table, which ensures they'll be available through the RPC interface.
src/qt/addresstablemodel.cpp (12)
27-27: Adding SparkName constant looks good.

This constant will be used to identify Spark name entries in the address table, which is consistent with the other address type constants.

50-54: Good practice: initializing members and updating constructor.

Adding the default initialization of `isMine` to `false` and updating the constructor to include the `_isMine` parameter is a good practice for maintaining object state consistency.

97-114: Well-structured implementation for tracking pending changes.

The `PendingSparkNameChange` struct and related methods for tracking changes are well-implemented:

- The struct properly encapsulates the change type and spark name data
- Thread safety is ensured with the critical section lock
- The methods follow a consistent pattern of appending changes to a pending list

This approach aligns with how other pending changes are handled in the codebase.

117-127: Proper signal connection handling in constructor/destructor.

The constructor connects to the spark name signals and the destructor disconnects them. This is important to prevent memory leaks and ensure proper cleanup when the object is destroyed.
129-145: Well-implemented method for refreshing spark names.

The `refreshSparkNames()` method:

- Retrieves spark names from the `CSparkNameManager`
- Properly handles the spark address and name formatting
- Checks if the address is owned by the wallet using `IsSparkAddressMine`
- Adds entries to the cache with the correct type and ownership information

This follows the same pattern as other address loading methods in the codebase.

151-152: Proper lock for thread safety.

Adding the `LOCK(cs_main)` is important for thread safety when accessing the `CSparkNameManager`, which likely depends on blockchain state.

200-200: Good integration of spark names refresh into the address table refresh.

Calling `refreshSparkNames()` within `refreshAddressTable()` ensures that spark names are loaded alongside other address types when the table is refreshed.
248-249: Consistent spark name identification.

This code consistently uses the "@" prefix to identify spark names, matching the approach used in the new address entry method.

317-342: Well-structured method for processing pending spark name changes.

The `processPendingSparkNameChanges` method:

- Safely extracts pending changes using proper locking
- Checks if addresses already exist in the model to determine the update type
- Updates entries with appropriate parameters including ownership status
- Follows the existing patterns for updating the address table

This implementation ensures that spark name changes are properly processed and reflected in the UI.

401-404: Good UI display representation for spark names.

The code properly differentiates between "own spark name" and "spark name" based on whether the address belongs to the wallet. This helps users identify their own spark names versus others in the UI.

783-786: Public method provides good API access for processing changes.

The `ProcessPendingSparkNameChanges` public method exposes the functionality to process pending spark name changes, which can be called from other parts of the application when necessary.
235-236: ❓ Verification inconclusive

Using label prefix to determine address type might be fragile.

The code determines if an address is a spark name by checking if the label starts with "@". This approach relies on a naming convention that could potentially be bypassed if labels are manually edited. Consider whether there's a more robust way to identify spark names that doesn't depend on the label format.

🏁 Script executed:

```shell
#!/bin/bash
# Check if there are any other methods to identify spark names in the codebase
rg -A 2 -B 2 "sparkName" --glob "*.{cpp,h}"
```

Length of output: 50403

Spark Name Identification – Verify Robustness

The code in question uses the pattern of checking whether the label starts with "@" to determine if an address is a spark name. Our investigation shows that this approach is used consistently throughout the codebase (for example, in sendcoinsdialog.cpp, createsparknamepage.cpp, and walletmodel.cpp). This suggests that the "@" prefix is an intentional convention for distinguishing spark names.

However, as noted, relying solely on the label format can be fragile, especially if users are allowed to manually edit labels. It might be worthwhile to consider reinforcing this logic by either:

- Storing an explicit flag (or property) in the address entry to denote spark names, or
- Cross-checking against the spark name registry (via `CSparkNameManager`), so that even if the label is modified, the address type can still be determined reliably.

Please verify whether the current trade-off is acceptable for your project or if implementing one of these alternatives would improve robustness.
PR intention

A spark name is a way to create an easily remembered alias for a spark address. This PR adds both the core and UI functionality for creating and managing spark names. A fee is required to register a spark name, and it depends on the name's length: the shorter the name, the more expensive the registration.

Code changes brief

The core functionality is added on top of the spark spend transaction. When spark funds are spent to the development fund address, spark name metadata is expected to follow in the transaction.