feat: single file build #181
Conversation
/update
…remaining issues in single file build!!! no network requests anymore. working decompress & smaller builds
/review
PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

> Caution: Review failed. The pull request is closed.

Walkthrough

This pull request introduces a new GitHub Actions workflow for building a single-file version of the project and updates various configurations and scripts. Changes include adjustments to build scripts and environment-based conditional logic (using the `SINGLE_FILE_BUILD` environment variable).

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant U as User
    participant GH as GitHub Actions
    participant Repo as Repository
    participant Build as Build Environment
    participant Art as Artifact Storage
    U->>GH: Trigger workflow_dispatch
    GH->>Repo: Checkout repository
    GH->>Build: Setup Node.js (v22) & Install pnpm
    Build->>Repo: Install dependencies (pnpm install)
    Build->>Build: Execute build-single-file command<br>(build & move file to minecraft.html)
    Build->>Art: Upload minecraft.html as artifact
```

```mermaid
sequenceDiagram
    participant App as Application Code
    participant Env as Environment Variable (SINGLE_FILE_BUILD)
    participant HTML as HTML Element ("mesher-worker-code")
    participant Worker as Worker Instance
    App->>Env: Check if SINGLE_FILE_BUILD == true
    alt Single-file mode enabled
        App->>HTML: Retrieve worker code from element
        HTML->>App: Return inline worker script
        App->>App: Create Blob & generate URL
        App->>Worker: Instantiate Worker using Blob URL
    else Default mode
        App->>Worker: Instantiate Worker using predefined script path
    end
```
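The single-file branch shown in the diagram above can be sketched as a small pure function. This is a hedged sketch, not the PR's actual code; all names here (`resolveWorkerSource`, `WorkerSource`) are hypothetical:

```typescript
// Sketch of the worker bootstrap decision: in single-file mode the worker
// code comes from an inline element and is wrapped in a Blob URL; otherwise
// a normal script path is used.
type WorkerSource =
  | { kind: 'inline-blob'; code: string }
  | { kind: 'script-path'; path: string }

function resolveWorkerSource (
  singleFileBuild: boolean,
  inlineCode: string | null,
  scriptPath: string
): WorkerSource {
  if (singleFileBuild) {
    if (!inlineCode) {
      // Fail loudly when the inline element is missing, as the review suggests.
      throw new Error('Missing inline worker code for single-file build')
    }
    return { kind: 'inline-blob', code: inlineCode }
  }
  return { kind: 'script-path', path: scriptPath }
}
```

In the browser, the `inline-blob` branch would then become `new Worker(URL.createObjectURL(new Blob([code], { type: 'text/javascript' })))`.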
Possibly related PRs
Suggested labels
Poem
> Warning: There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 ESLint
Scope: all 2 workspace projects

📜 Recent review details
Configuration used: CodeRabbit UI
📒 Files selected for processing (1)

✨ Finishing Touches

🪧 Tips

Chat: There are 3 ways to chat with CodeRabbit.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (invoked using PR comments)

Other keywords and placeholders

CodeRabbit Configuration File (
/deploy

Deployed to Vercel Preview: https://prismarine-q80bkayji-zaro.vercel.app
Actionable comments posted: 5
🧹 Nitpick comments (19)
renderer/viewer/lib/worldrendererCommon.ts (1)
194-201: The worker initialization approach for single file builds looks good, but consider adding error handling

The implementation properly loads the worker script from an HTML element when in single file build mode, which is essential for embedding all code in a single file. However, there are non-null assertions that could be risky.

Consider adding error handling to gracefully handle missing elements:

```diff
 let worker: any
 if (process.env.SINGLE_FILE_BUILD) {
-  const workerCode = document.getElementById('mesher-worker-code')!.textContent!
+  const mesherElement = document.getElementById('mesher-worker-code')
+  if (!mesherElement || !mesherElement.textContent) {
+    console.error('Missing mesher-worker-code element or content')
+    throw new Error('Failed to initialize worker: missing mesher-worker-code element')
+  }
+  const workerCode = mesherElement.textContent
   const blob = new Blob([workerCode], { type: 'text/javascript' })
   worker = new Worker(window.URL.createObjectURL(blob))
 } else {
   worker = new Worker(src)
 }
```

scripts/genLargeDataAliases.ts (1)
30-51: Consider adding error handling to the decompression function

The decompression function lacks error handling, which could cause silent failures or difficult-to-debug issues if the input is invalid or if the decompression fails.

```diff
 function decompressFromBase64(input) {
   console.time('decompressFromBase64')
   // Decode the Base64 string
+  try {
     const binaryString = atob(input);
     const len = binaryString.length;
     const bytes = new Uint8Array(len);
     // Convert the binary string to a byte array
     for (let i = 0; i < len; i++) {
       bytes[i] = binaryString.charCodeAt(i);
     }
     // Decompress the byte array
     const decompressedData = pako.inflate(bytes, { to: 'string' });
     console.timeEnd('decompressFromBase64')
     return decompressedData;
+  } catch (error) {
+    console.error('Error during decompression:', error);
+    console.timeEnd('decompressFromBase64')
+    throw new Error(`Failed to decompress data: ${error.message}`);
+  }
 }
```

.github/workflows/build-single-file.yml (1)
1-32: Well-structured GitHub Action workflow

The workflow is clean, follows best practices, and handles the complete process from building to artifact upload. Note that the workflow uses `checkout@master`, which might be less predictable than using a specific version.

Consider using a specific version of actions/checkout:

```diff
-      - uses: actions/checkout@master
+      - uses: actions/checkout@v4
```

This provides better version control and predictability for your CI/CD pipeline.
src/react/SignEditorProvider.tsx (1)
25-25: State initialization with `null as any` could be improved.

Using `null as any` for state initialization bypasses TypeScript's type safety. Consider defining a proper type or using a more specific type assertion.

```diff
-  const [proseMirrorView, setProseMirrorView] = useState(null as any)
+  const [proseMirrorView, setProseMirrorView] = useState<typeof import('./prosemirror-markdown').ProseMirrorView | null>(null)
```

src/connect.ts (1)
57-62: Consider avoiding global assignments.

Though assigning `mcData`, `PrismarineBlock`, `PrismarineItem`, and `pathfinder` to the `window` object can be convenient, it may introduce potential conflicts with other scripts or libraries. Explore namespaced objects or dedicated singletons to contain these references.

src/shims/minecraftData.ts (1)

26-26: Resolve data loading concurrency.

Calling `await importLargeData('mcData')` and immediately resolving it is fine, but confirm that parallel calls won't interfere with the shared `optimizedDataResolver`.

rsbuild.config.ts (5)
18-18: Ensure environment variable reliability.

`const SINGLE_FILE_BUILD = process.env.SINGLE_FILE_BUILD === 'true'` is straightforward, but confirm that your build environment reliably sets `SINGLE_FILE_BUILD` and won't introduce false positives.

20-24: Implicit override of font references.

Patching the Pixelart Icons font CSS is a creative solution, but be aware this might break if new versions of the font or file references change. You may want to handle this gracefully or detect the correct snippet to replace.

65-104: Refactor repetitive HTML tag injection logic.

Injecting tags and meta tags conditionally can become bloated. Consider a function or template approach to encapsulate the logic for toggling single-file vs. multi-file references.

114-120: Evaluate potential performance impact of single-file build outputs.

Inlining all scripts, styles, and large assets (`dataUriLimit`) can create huge files, leading to long initial load times. Monitor final bundle size and consider lazy loading or chunking for large assets.
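The size concern behind `dataUriLimit` can be made concrete: base64 encoding, which data URIs rely on, expands binary payloads by roughly one third before any quoting or HTML overhead. A standalone Node sketch (not project code):

```typescript
import { Buffer } from 'node:buffer'

// Base64 emits 4 output characters for every 3 input bytes (plus padding),
// so inlined assets grow by about 33%.
function base64Overhead (byteLength: number): number {
  const encodedLength = 4 * Math.ceil(byteLength / 3)
  return encodedLength / byteLength
}

const raw = Buffer.alloc(300_000) // stand-in for a 300 kB image
const encoded = raw.toString('base64')
console.log(encoded.length)             // 400000 characters
console.log(base64Overhead(raw.length)) // ~1.333
```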
205-220: Post-build single-file transformation.

Replacing `favicon.png` and `loading-bg.jpg` with inlined data is fine, but watch out for large size expansions. Also confirm that `mesher.js` is appended properly and that no script collisions occur.

src/panorama.ts (3)
15-15: Consider dynamic importing of `loadMinecraftData`.

If single-file size is a concern, you could dynamically import `loadMinecraftData` inside `initDemoWorld()` to reduce initial load time.

149-154: Wrap callback unloading with error handling.

One failing callback could block the others. Surround each callback with a try/catch block to improve robustness.

```diff
 for (const unloadPanoramaCallback of unloadPanoramaCallbacks) {
-  unloadPanoramaCallback()
+  try {
+    unloadPanoramaCallback()
+  } catch (err) {
+    console.error('Failed to unload panorama resource:', err)
+  }
 }
 unloadPanoramaCallbacks = []
```
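The try/catch suggestion generalizes to a small helper (hypothetical, not in the PR) that runs every cleanup callback even when some of them throw:

```typescript
// Run all cleanup callbacks, isolating failures so one bad callback
// cannot prevent the rest from running; collected errors are returned
// so the caller can log or inspect them.
function runAllCallbacks (callbacks: Array<() => void>): Error[] {
  const errors: Error[] = []
  for (const cb of callbacks) {
    try {
      cb()
    } catch (err) {
      errors.push(err instanceof Error ? err : new Error(String(err)))
    }
  }
  return errors
}
```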
163-232: Potential heavy loop and large data usage in `initDemoWorld`.

Filling a 200×200 area with random blocks could impact performance in the browser. Consider chunk-based loading or limiting the size for better responsiveness. Also verify references like `loadedData` to prevent runtime errors if it's not loaded in single-file mode.

scripts/makeOptimizedMcData.mjs (1)

46-58: Turn hard-coded compression option into an environment-based toggle.

Forcing `compressedOutput = true` might hamper troubleshooting. Consider using `!!process.env.SINGLE_FILE_BUILD` or a config flag instead of a permanent boolean.

```diff
-const compressedOutput = true
+const compressedOutput = !!process.env.SINGLE_FILE_BUILD
```

src/optimizeJson.ts (4)
21-31: Consider normalizing the `_proccessed` key name.

The current key is spelled as `_proccessed`, which may be a spelling mistake and could cause confusion. You might rename it to `_processed` to maintain clarity.

```diff
-    if (key === '_proccessed') {
+    if (key === '_processed') {
```

32-39: Validate the error message for `mapId`.

While the mapping logic is sound, consider refining the error message to clarify that the unmatched ID is actually a name or numeric ID. This can help troubleshoot mapping issues more quickly.

106-115: Use more specific parameter types for `restoreMinecraftData`.

All parameters are typed as `any` and `string`, which can reduce type safety. Consider introducing interfaces or stricter types to improve maintainability and catch potential data shape issues earlier.

258-258: Document the new `dataRestorer` parameter.

This new optional parameter is helpful for post-processing the restored data. Add JSDoc or inline comments to clarify its purpose for future maintainers.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)

- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (24)

- `.github/workflows/build-single-file.yml` (1 hunks)
- `.github/workflows/publish.yml` (1 hunks)
- `config.json` (1 hunks)
- `index.html` (0 hunks)
- `package.json` (3 hunks)
- `renderer/rsbuildSharedConfig.ts` (2 hunks)
- `renderer/viewer/lib/worldrendererCommon.ts` (1 hunks)
- `rsbuild.config.ts` (6 hunks)
- `scripts/genLargeDataAliases.ts` (1 hunks)
- `scripts/genShims.ts` (1 hunks)
- `scripts/makeOptimizedMcData.mjs` (6 hunks)
- `scripts/testOptimizedMcdata.ts` (3 hunks)
- `src/connect.ts` (3 hunks)
- `src/devtools.ts` (1 hunks)
- `src/index.ts` (5 hunks)
- `src/optimizeJson.ts` (4 hunks)
- `src/optionsStorage.ts` (2 hunks)
- `src/panorama.ts` (4 hunks)
- `src/react/MainMenuRenderApp.tsx` (1 hunks)
- `src/react/SignEditor.stories.tsx` (2 hunks)
- `src/react/SignEditor.tsx` (3 hunks)
- `src/react/SignEditorProvider.tsx` (2 hunks)
- `src/serviceWorker.ts` (1 hunks)
- `src/shims/minecraftData.ts` (3 hunks)
💤 Files with no reviewable changes (1)
- index.html
🧰 Additional context used
🪛 Biome (1.9.4)
src/optimizeJson.ts
[error] 93-93: The catch clause that only rethrows the original error is useless.
An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.
(lint/complexity/noUselessCatch)
[error] 99-99: The catch clause that only rethrows the original error is useless.
An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.
(lint/complexity/noUselessCatch)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: build-and-deploy
🔇 Additional comments (46)
config.json (1)
4-4
: Improvement to proxy URL specification

The change from a hostname to a full URL with protocol is a good practice as it removes ambiguity about which protocol to use. This is particularly important for a single file build where the application may need to make cross-origin requests, and using HTTPS is a security best practice.
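The ambiguity a full URL removes is easy to see with the WHATWG `URL` parser (the host below is hypothetical, not the project's actual proxy):

```typescript
// A full URL carries an explicit protocol; a bare hostname does not
// even parse as a URL.
const proxy = new URL('https://proxy.example.com')
console.log(proxy.protocol) // 'https:'
console.log(proxy.host)     // 'proxy.example.com'

let parseFailed = false
try {
  new URL('proxy.example.com') // no scheme: throws "Invalid URL"
} catch {
  parseFailed = true
}
console.log(parseFailed) // true
```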
.github/workflows/publish.yml (1)
56-57
: Added build step for single-file version

This step properly implements the single-file build process, generating a standalone HTML file named `minecraft.html`. This aligns with the PR objective and creates an artifact that can be used independently of the rest of the distribution.

Since this is a new build process, it would be good to verify that all necessary assets are included in the single file:

```shell
#!/bin/bash
# Check if the build-single-file script exists in package.json
jq '.scripts["build-single-file"]' package.json

# Look for SINGLE_FILE_BUILD environment variable usage
rg "SINGLE_FILE_BUILD" --type ts
```

src/react/SignEditor.stories.tsx (1)
2-2: Updated SignEditor stories to use ProseMirrorView directly

The code has been refactored to pass the `ProseMirrorView` component directly instead of using the `isWysiwyg` flag. This change improves type safety and makes the component API more explicit about its dependencies.

Make sure that this change is consistent with the implementation in the SignEditor component:

```shell
#!/bin/bash
# Check the SignEditor component props to ensure they match
cat src/react/SignEditor.tsx | rg "interface Props|type Props"
```

Also applies to: 23-23
scripts/genShims.ts (1)
18-18
: LGTM: Clean code formatting

This additional whitespace improves readability by separating the import/initialization logic from the resource processing logic.
src/react/MainMenuRenderApp.tsx (1)
90-92
: LGTM: Proper handling for single file build mode

The code correctly adds a new condition to check for the single file build mode environment variable, while preserving the existing logic for development and production modes.
src/serviceWorker.ts (1)
5-5
: LGTM: Properly disables service worker for single file builds

The change correctly prevents service worker registration when in single file build mode, which is appropriate since service workers typically handle caching and offline functionality that may not be needed in this context.
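The gate described here reduces to a small predicate; a hedged sketch with hypothetical names (the real file simply checks `process.env.SINGLE_FILE_BUILD`):

```typescript
// Service workers only make sense when the environment supports them
// and the app is not a self-contained single-file build.
function shouldRegisterServiceWorker (
  singleFileBuild: boolean,
  swSupported: boolean
): boolean {
  return swSupported && !singleFileBuild
}
```

In the browser, `swSupported` would be `'serviceWorker' in navigator`.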
scripts/genLargeDataAliases.ts (2)
3-28
: Clean implementation of data compression strategy

This function elegantly handles switching between compressed and raw data sources based on the build configuration. The approach allows for flexibility while minimizing the final bundle size when compression is enabled.
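The compressed path rests on a base64 + DEFLATE round trip. A standalone sketch using Node's built-in zlib rather than the pako library the browser code uses (both speak the same zlib-wrapped DEFLATE format, so the idea carries over):

```typescript
import { deflateSync, inflateSync } from 'node:zlib'
import { Buffer } from 'node:buffer'

// Compress a JSON string and pack it as base64, the way a single-file
// build would embed it; then reverse the process.
function compressToBase64 (text: string): string {
  return deflateSync(Buffer.from(text, 'utf8')).toString('base64')
}

function decompressFromBase64 (input: string): string {
  return inflateSync(Buffer.from(input, 'base64')).toString('utf8')
}

const original = JSON.stringify({ blocks: ['stone', 'dirt'], items: ['stick'] })
const packed = compressToBase64(original)
const restored = decompressFromBase64(packed)
console.log(restored === original) // true
```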
53-59
: Good standalone execution support

The self-executing logic is well-implemented, making the script usable both as an imported module and as a standalone command-line tool.
scripts/testOptimizedMcdata.ts (3)
2-2: Good API refactoring to use specialized function

The change from using a generic `JsonOptimizer.restoreData` to the more specialized `restoreMinecraftData` improves code semantics.
11-11: ✅ Verification successful

Verification needed for the updated data restoration logic

The implementation now uses `restoreMinecraftData` instead of the previous approach, but the function signature and behavior might be different.

🏁 Script executed:

```shell
#!/bin/bash
# Verify that restoreMinecraftData is correctly implemented

# Look for the restoreMinecraftData implementation
echo "Checking restoreMinecraftData implementation:"
rg -A 10 "export const restoreMinecraftData" src/
```

Length of output: 799

Verification Passed – Data Restoration Logic is Correct

- Verified that `restoreMinecraftData` is implemented in `src/optimizeJson.ts` with the expected signature.
- The function accepts `(allVersionData: any, type: string, version: string)` and includes logic to handle the `recipes` type specifically.
- The call in `scripts/testOptimizedMcdata.ts` using `structuredClone(json)` matches the implementation requirements.
95-96: ✅ Verification successful

Inconsistent parameter passing for restoreData

Both calls to `JsonOptimizer.restoreData` now pass `undefined` as the third parameter. Make sure this change is consistent with how the function is defined.

🏁 Script executed:

```shell
#!/bin/bash
# Check the signature of restoreData in JsonOptimizer

# Find the definition of restoreData
echo "Looking for restoreData implementation:"
rg -A 5 "restoreData" src/optimizeJson.ts
```

Length of output: 1000

Confirmation: Parameter passing for `JsonOptimizer.restoreData` is consistent

After reviewing the implementation in `src/optimizeJson.ts`, it's clear that the `restoreData` function is defined to accept a third parameter, `dataRestorer`, as an optional callback (`((data) => void) | undefined`). In the test file `scripts/testOptimizedMcdata.ts`, passing `undefined` for this parameter on lines 95-96 is fully consistent with the function's signature.

- The definition in `src/optimizeJson.ts` confirms the optional nature of the third parameter.
- Both test calls passing `undefined` align with the expected usage.

There's no inconsistency here, so no changes are needed.
renderer/rsbuildSharedConfig.ts (3)
81-86: Good resource path handling for Minecraft data files

The conditional path handling for Minecraft data files ensures that the correct resources are loaded based on the actual file paths, improving compatibility and preventing potential issues.

101-104: Added support for .txt files as asset/source

This change ensures text files are properly handled in the build process, consistent with how other non-JavaScript resources are handled.

110-112: ❓ Verification inconclusive

Key change for single-file build: eager dynamic imports

Setting `dynamicImportMode` to `'eager'` when `SINGLE_FILE_BUILD` is true ensures that all dynamic imports are resolved at build time rather than runtime, which is essential for creating a self-contained single file. This is a critical part of implementing the single-file build feature.

Check that dynamicImportMode is properly documented and compatible with your version of rspack:

```shell
#!/bin/bash
# Check rspack version and compatibility with dynamicImportMode='eager'

# Get the rspack version
echo "Checking @rspack/core version:"
grep "@rspack/core" package.json
```

Length of output: 103

Action Required: Verify @rspack/core dependency and dynamicImportMode compatibility

The execution of the version-check script returned no output, which suggests that the dependency for `@rspack/core` may not be explicitly declared in your `package.json` (or it's referenced differently). Please confirm that:

- The package (or an equivalent rspack dependency) is correctly declared in your `package.json` and supports the `dynamicImportMode: 'eager'` option.
- The rspack documentation has been reviewed to ensure that enabling eager dynamic imports is compatible with your current version/configuration.

Manually verify the dependency setup and consult the rspack docs for confirmation.
src/react/SignEditorProvider.tsx (2)
77-81: Good implementation of conditional dynamic import.

The conditional import approach correctly handles the single file build scenario. When not in a single file build, it dynamically imports the ProseMirrorView component only when needed.

85-85: Component prop passed correctly based on condition.

The conditional passing of `ProseMirrorView` to the `SignEditor` component is well implemented, maintaining the intended functionality while adapting for the single file build.

src/optionsStorage.ts (3)
166-167
: Good implementation of environment-dependent storage key.

Using a different localStorage key for the single file build prevents conflicts when opening HTML files locally in the browser, which is a thoughtful consideration for different deployment scenarios.
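The key choice can be isolated into a pure helper. A hedged sketch: the key names below are hypothetical, not the ones the PR actually uses:

```typescript
// Separate localStorage namespaces so a locally opened single-file HTML
// does not clobber options saved by the hosted app on the same origin.
function optionsStorageKey (singleFileBuild: boolean): string {
  return singleFileBuild ? 'options-single-file' : 'options'
}
```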
171-171: Options correctly migrated using the environment-dependent storage key.

The migration function now correctly uses the dynamic `localStorageKey` to retrieve stored options.

190-190: Consistent use of storage key for saving options.

The code correctly uses the same dynamic `localStorageKey` when saving options back to localStorage, ensuring consistency in how data is persisted.

src/index.ts (5)
90-90: Imports updated to match renamed function.

The import statement correctly references the renamed `loadMinecraftData` function from the `./connect` module, maintaining the functionality while adapting for the single file build.

114-114: New data loading approach using importLargeData.

The importLargeData function has been introduced to handle large data files, which is likely part of the single file build strategy for managing resource size.

417-417: Function call updated to match renamed function.

The code now correctly calls the renamed `loadMinecraftData` function instead of the previous `downloadMcDataOnConnect`.

433-433: Blockstates models loading updated to use importLargeData.

The approach for loading blockstates models has been updated to use the new `importLargeData` function instead of direct import, which is consistent with the single file build strategy.

875-875: Status clearing after progress completion.

The code now clears the loading screen status after the progress is completed, which improves the user experience by not leaving stale status messages.

src/react/SignEditor.tsx (3)
src/react/SignEditor.tsx (3)
4-4
: Type import updated correctly.The import statement has been correctly updated to be a type-only import for ProseMirrorView, which is appropriate since it's now used only for type checking purposes in this file.
15-15
: Props interface updated to use ProseMirrorView type.The Props interface has been updated to replace the boolean
isWysiwyg
property with theProseMirrorView
component type, which provides better type safety and a more direct representation of what's needed.
25-26
: Component refactored to derive WYSIWYG state from props.The component now correctly derives the WYSIWYG state from the presence of the ProseMirrorView prop rather than having it passed as a separate boolean. This is a good refactoring that reduces prop redundancy.
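Deriving the flag from the prop's presence keeps the two from ever disagreeing. A minimal sketch (the component shape here is hypothetical, reduced to the one relevant prop):

```typescript
// The presence of the injected editor component is the single source of
// truth; no separate isWysiwyg boolean can drift out of sync with it.
interface SignEditorProps {
  ProseMirrorView?: unknown // optionally injected editor component
}

function isWysiwyg (props: SignEditorProps): boolean {
  return props.ProseMirrorView !== undefined
}
```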
src/connect.ts (4)
8-11: Approve new imports for local modules.

These lines cleanly integrate references to local modules (`globalState`, `optionsStorage`, and `botSoundSystem`) and appear consistent.
3-7: ❓ Verification inconclusive

Verify imports' existence and consistency.

These new imports (MinecraftData, PrismarineBlock, PrismarineItem, pathfinder, importLargeData) look correct, but please ensure that all referenced dependencies are properly installed and that the bundler or runtime can resolve them without issues.

Run the following script to verify install references:

🏁 Script executed:

```shell
#!/bin/bash
# Verify that the newly imported packages exist in package.json
rg "minecraft-data|prismarine-block|prismarine-item|mineflayer-pathfinder|large-data-aliases" package.json -A 5
```

Length of output: 1266

Imports Verification: Dependency Entries Exist & Local Alias File Check Needed

- The external dependencies (`minecraft-data`, `prismarine-block`, `prismarine-item`, and `mineflayer-pathfinder`) are correctly referenced in the project's package.json.
- The local import (`importLargeData` from `../generated/large-data-aliases`) should be confirmed by ensuring that the corresponding generated file exists in the repository.

Please verify that the generated file for `large-data-aliases` is present and correctly referenced in your build configuration.
64-66: ✅ Verification successful

Confirm `viewer` scope.

This code references `viewer.world` but doesn't import or define `viewer` within this file. If it's a global variable, ensure there's no risk of race conditions or undefined references.

🏁 Script executed:

```shell
#!/bin/bash
# Search for the definition of `viewer` to ensure it's declared before use
rg "(\s|^)viewer(\s|=|.)"
```

Length of output: 21720

Global "viewer" confirmed and initialization order verified.

We found that `viewer` is declared as a global in src/globals.d.ts and instantiated in src/index.ts, which is consistent with its usage in src/connect.ts. As long as the global instantiation completes before this file's execution, there isn't a risk of race conditions or undefined references.
47-48: ❓ Verification inconclusive

Check updated function signature usage.

Renaming `downloadMcDataOnConnect` to `loadMinecraftData` with an additional `importBlockstatesModels` parameter is a breaking change. Ensure that all invocations have been updated across the codebase.

🏁 Script executed:

```shell
#!/bin/bash
# Search all references to loadMinecraftData for correct parameter usage
rg "loadMinecraftData"
```

Length of output: 453

Action: Verify Consistency of New `loadMinecraftData` Usage

The function has been renamed and now accepts an additional optional parameter (`importBlockstatesModels`) with a default value of `false`. The recent scan confirms the following:

- src/panorama.ts: The invocation `await loadMinecraftData(version, true)` explicitly sets the new parameter.
- src/index.ts: The invocation `await loadMinecraftData(version)` relies on the default value (`false`).

Please ensure that the default behavior in `src/index.ts` is intentional and aligns with the overall design since relying on the default may cause unexpected behavior if the previous implementation assumed different logic.

src/shims/minecraftData.ts (2)
57-57: Validate restored data structure.

`restoreMinecraftData(optimizedDataResolver.resolvedData, dataType, version)` must ensure it merges or replaces the data in a type-safe manner. Verify no silent type mismatches occur.
2-5: ❓ Verification inconclusive

Check newly introduced data-loading imports.

Imports for `restoreMinecraftData` and `importLargeData` are newly introduced. Confirm that these modules are exported under the correct paths and that compatibility aligns with your bundler configuration.

🏁 Script executed:

```shell
#!/bin/bash
# Verify that restoreMinecraftData & importLargeData are available in the indicated modules
rg "export function restoreMinecraftData" src/optimizeJson.ts
rg "export function importLargeData" generated/large-data-aliases.*
```

Length of output: 206

Verify Data-Loading Import Paths

The export for `restoreMinecraftData` in `src/optimizeJson.ts` appears to be in order. However, the search for `importLargeData` in the expected module at `../../generated/large-data-aliases` returned a "No such file or directory" error. Please manually verify that the file exists (and is correctly named and generated) with the proper export for `importLargeData`, and that the import path aligns with your bundler's configuration. If the file is located elsewhere or generated under a different name, update the import accordingly.

rsbuild.config.ts (6)
57-59: Double-check config source fallback and favicon path.

`configSource` toggles based on `SINGLE_FILE_BUILD`. Verify that `'BUNDLED'` config usage meets your runtime requirements. Also, confirm `'favicon.png'` is available in all builds.

132-134: Check for environment variable collisions.

Including `SINGLE_FILE_BUILD` and `SINGLE_FILE_BUILD_MODE` in the global environment is okay, but be aware these might collide or be overwritten if multiple scripts set them.

158-158: Build plugin 'setup' approach looks good.

The approach for hooking into `onBeforeBuild` and `onAfterBuild` is consistent with typical plugin flows.

162-170: Check data optimization scripts.

Logic for checking existence of `minecraft-data-optimized.json` and `mc-assets-compressed.js` is sound. Ensure that `genLargeDataAliases(SINGLE_FILE_BUILD)` matches the intended optimization workflow.
14-16: ✅ Verification successful

Confirm new dependencies and usage.

`genLargeDataAliases`, `sharp`, and `supportedVersions` imports are new. Verify that they are installed, compatible with your environment (especially `sharp` on non-native runtimes), and properly referenced.

🏁 Script executed:

```shell
#!/bin/bash
# Check package.json or lock file for these dependencies
rg "genLargeDataAliases|sharp|supportedVersions" package.json
```

Length of output: 214

Dependency Verification Confirmation

- Confirmed that `sharp` is correctly declared in `package.json` (version `^0.33.5`).
- Noted that `genLargeDataAliases` is referenced in the `prepare-project` script, confirming its expected inclusion as a local utility.
- Verified that `supportedVersions` is a local module imported from `./src/supportedVersions.mjs`, so no external dependency declaration is needed.
- Please ensure that the `sharp` usage is validated on non-native runtimes if applicable.
221-231: ❓ Verification inconclusive

Validate service worker generation logic.

When not in single-file mode, the `generateSW` call should produce a valid service worker. Double-check that `additionalManifestEntries` and caching logic won't conflict with your single-file approach.

🏁 Script executed:

```shell
#!/bin/bash
# Inspect service-worker generation logic
rg "workbox-build|generateSW" . -A 10
```

Length of output: 4123

Confirm Service Worker Generation for Multi-File vs. Single-File Builds

The current implementation uses Workbox's `generateSW` only when service workers are enabled (i.e. when `disableServiceWorker` is false). Please verify that:

- In multi-file builds, the generated service worker (with options like `skipWaiting` and `clientsClaim`) is fully functional.
- The manifest entries returned by `getSwAdditionalEntries()` and the empty `globPatterns` setting appropriately cover your caching needs without conflicting with your single-file build process.

If the single-file build should disable or adjust service worker logic, consider gating this call accordingly.
src/panorama.ts (3)
8-11: Imports look good.

These imports are properly utilized later on (e.g., `tweenJs` in pointermove logic, `getSyncWorld` for the demo world). No immediate issues spotted.

73-73: Callback array structure looks fine.

Declaring this array for unloading resources is a neat approach. No issues found.

79-84: Verify whether you need a fallback panorama.

Returning early if `SINGLE_FILE_BUILD_MODE` is set and the resource pack is unavailable means no panorama is loaded. If that's intentional, it's fine. Otherwise, consider a fallback.

```diff
 if (process.env.SINGLE_FILE_BUILD_MODE && !panoramaUsesResourcePack) {
   void initDemoWorld()
   return
+  // If you'd like a simple fallback, remove the `return` statement
+  // and load a default panorama here instead.
 }
```

package.json (3)
13-14: Preparation and check-build refactor looks good.

Consolidating generation scripts into `prepare-project` before type checks is a clean approach.

41-41: Confirm release artifact glob support.

If your release tooling doesn't expand globs, attaching multiple files by name might be necessary. Verify that `{self-host.zip,minecraft.html}` is accepted.

176-176: Adding "sharp" is a useful choice.

No issues found with introducing "sharp" as an optional dependency for image processing tasks.
src/optimizeJson.ts (1)
306-311: Confirm the side effects of `dataRestorer` usage.

Calling `dataRestorer?.(data)` after constructing `data` implies a side effect on it or external components. Ensure that any mutation is intentional and that the return value does not need to be reassigned to `data`.
```diff
@@ -3,7 +3,9 @@
 import fs from 'fs'
 import { WorldRendererThree } from 'renderer/viewer/lib/worldrendererThree'
 import { enable, disable, enabled } from 'debug'
 import { Vec3 } from 'vec3'
```
💡 Verification agent
🧩 Analysis chain
Exposing Vec3 to the global window object
Adding Vec3 to the global window object allows access from the browser console and other scripts without explicit imports. While this is useful for debugging, exposing library objects globally is generally not recommended for production code.
Also applies to: 8-8
🏁 Script executed:

```shell
#!/bin/bash
# Check if Vec3 is being used directly in other parts of the codebase
# that might now rely on global access instead of imports

# Look for imports of Vec3 that might have been removed
rg "import.*Vec3.*from ['\"]vec3['\"]" --type ts

# Look for usage of Vec3 without imports
rg "Vec3" --type ts | grep -v "import.*Vec3"
```

Length of output: 15329
Action: Remove or Gate Global Exposure of Vec3
The verification confirms that while almost every module continues to import Vec3 explicitly, the devtools (and an instance in the mesher module) still expose Vec3 on the global object (e.g., via `window.Vec3 = Vec3` and `globalThis.Vec3 = Vec3`). This global assignment is useful for debugging but can lead to unintended use in production.
- Files to review:
  - `src/devtools.ts` (line 8 shows `window.Vec3 = Vec3`)
  - `renderer/playground/mesher/mesher.ts` (global exposure via `globalThis.Vec3 = Vec3`)
Recommendation:
Either remove these global assignments or wrap them in a check (e.g., verifying that the environment is not production) so that the debugging convenience does not affect production stability.
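A minimal sketch of the gated approach — here `Vec3` is a stand-in class and the `env` parameter a stand-in for the real build-time check, which may differ in this project:

```javascript
// Stand-in for the real Vec3 class imported from 'vec3'.
class Vec3 {
  constructor (x, y, z) { this.x = x; this.y = y; this.z = z }
}

// Attach debug-only globals only outside production builds.
function exposeDebugGlobals (env) {
  if (env !== 'production') {
    globalThis.Vec3 = Vec3
  }
}
```

In the real code the same guard could wrap the two assignment sites in `src/devtools.ts` and the mesher playground, e.g. testing `process.env.NODE_ENV` if the bundler inlines it.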
function processRecipes (current, prev, getData, version) {
  // can require the same multiple times per different versions
  if (current._proccessed) return
  const items = getData('items')
  const blocks = getData('blocks')
  const itemsIdsMap = Object.fromEntries(items.map((b) => [b.id, b.name]))
  const blocksIdsMap = Object.fromEntries(blocks.map((b) => [b.id, b.name]))
  for (const key of Object.keys(current)) {
    const mapId = (id) => {
      if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
      const mapped = itemsIdsMap[id] ?? blocksIdsMap[id]
      if (!mapped) {
        throw new Error(`No item/block name with id ${id}`)
      }
      return mapped
    }
    const processRecipe = (obj) => {
      // if (!obj) return
      // if (Array.isArray(obj)) {
      //   obj.forEach((id, i) => {
      //     obj[i] = mapId(obj[id])
      //   })
      // } else if (obj && typeof obj === 'object') {
      //   if (!'count metadata id'.split(' ').every(x => x in obj)) {
      //     throw new Error(`process error: Unknown deep object pattern: ${JSON.stringify(obj)}`)
      //   }
      //   obj.id = mapId(obj.id)
      // } else {
      //   throw new Error('unknown type')
      // }
      const parseRecipeItem = (item) => {
        if (typeof item === 'number') return mapId(item)
        if (Array.isArray(item)) return [mapId(item), ...item.slice(1)]
        if (!item) {
          return item
        }
        if ('id' in item) {
          item.id = mapId(item.id)
          return item
        }
        throw new Error('unhandled')
      }
      const maybeProccessShape = (shape) => {
        if (!shape) return
        for (const shapeRow of shape) {
          for (const [i, item] of shapeRow.entries()) {
            shapeRow[i] = parseRecipeItem(item)
          }
        }
      }
      if (obj.result) obj.result = parseRecipeItem(obj.result)
      maybeProccessShape(obj.inShape)
      maybeProccessShape(obj.outShape)
      if (obj.ingredients) {
        for (const [i, ingredient] of obj.ingredients.entries()) {
          obj.ingredients[i] = parseRecipeItem(ingredient)
        }
      }
    }
    try {
      const name = mapId(key)
      for (const [i, recipe] of current[key].entries()) {
        try {
          processRecipe(recipe)
        } catch (err) {
          console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
          delete current[i]
        }
      }
      current[name] = current[key]
    } catch (err) {
      console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
    }
    delete current[key]
  }
  current._proccessed = true
}
Fix incorrect recipe parsing logic.
There are two critical issues:
- Mapping array items uses `mapId(item)` instead of `mapId(item[0])`.
- Deleting an entry in the loop calls `delete current[i]` instead of `delete current[key][i]`.
- if (Array.isArray(item)) return [mapId(item), ...item.slice(1)]
+ if (Array.isArray(item)) return [mapId(item[0]), ...item.slice(1)]
...
- delete current[i]
+ delete current[key][i]
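The array case fails because `mapId` expects a bare string or number id but receives the whole tuple, tripping the type check. A standalone sketch with a toy id table (standing in for `itemsIdsMap`/`blocksIdsMap`) illustrates the corrected behavior:

```javascript
// Toy id-to-name table standing in for itemsIdsMap/blocksIdsMap.
const idsMap = { 1: 'stone', 2: 'stick' }

const mapId = (id) => {
  if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
  const mapped = idsMap[id]
  if (!mapped) throw new Error(`No item/block name with id ${id}`)
  return mapped
}

const parseRecipeItem = (item) => {
  if (typeof item === 'number') return mapId(item)
  // Corrected: map only the id in slot 0 and keep the rest of the tuple.
  if (Array.isArray(item)) return [mapId(item[0]), ...item.slice(1)]
  return item
}
```

With the original `mapId(item)`, the whole array would reach the `typeof` check and throw `'Incorrect type'` for every array-shaped recipe entry.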
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
function processRecipes (current, prev, getData, version) {
  // can require the same multiple times per different versions
  if (current._proccessed) return
  const items = getData('items')
  const blocks = getData('blocks')
  const itemsIdsMap = Object.fromEntries(items.map((b) => [b.id, b.name]))
  const blocksIdsMap = Object.fromEntries(blocks.map((b) => [b.id, b.name]))
  for (const key of Object.keys(current)) {
    const mapId = (id) => {
      if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
      const mapped = itemsIdsMap[id] ?? blocksIdsMap[id]
      if (!mapped) {
        throw new Error(`No item/block name with id ${id}`)
      }
      return mapped
    }
    const processRecipe = (obj) => {
      // if (!obj) return
      // if (Array.isArray(obj)) {
      //   obj.forEach((id, i) => {
      //     obj[i] = mapId(obj[id])
      //   })
      // } else if (obj && typeof obj === 'object') {
      //   if (!'count metadata id'.split(' ').every(x => x in obj)) {
      //     throw new Error(`process error: Unknown deep object pattern: ${JSON.stringify(obj)}`)
      //   }
      //   obj.id = mapId(obj.id)
      // } else {
      //   throw new Error('unknown type')
      // }
      const parseRecipeItem = (item) => {
        if (typeof item === 'number') return mapId(item)
        if (Array.isArray(item)) return [mapId(item[0]), ...item.slice(1)]
        if (!item) {
          return item
        }
        if ('id' in item) {
          item.id = mapId(item.id)
          return item
        }
        throw new Error('unhandled')
      }
      const maybeProccessShape = (shape) => {
        if (!shape) return
        for (const shapeRow of shape) {
          for (const [i, item] of shapeRow.entries()) {
            shapeRow[i] = parseRecipeItem(item)
          }
        }
      }
      if (obj.result) obj.result = parseRecipeItem(obj.result)
      maybeProccessShape(obj.inShape)
      maybeProccessShape(obj.outShape)
      if (obj.ingredients) {
        for (const [i, ingredient] of obj.ingredients.entries()) {
          obj.ingredients[i] = parseRecipeItem(ingredient)
        }
      }
    }
    try {
      const name = mapId(key)
      for (const [i, recipe] of current[key].entries()) {
        try {
          processRecipe(recipe)
        } catch (err) {
          console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
          delete current[key][i]
        }
      }
      current[name] = current[key]
    } catch (err) {
      console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
    }
    delete current[key]
  }
  current._proccessed = true
}
@@ -9,7 +9,9 @@
"start2": "run-p dev-rsbuild watch-mesher",
"build": "pnpm build-other-workers && rsbuild build",
"build-analyze": "BUNDLE_ANALYZE=true rsbuild build && pnpm build-other-workers",
"check-build": "tsx scripts/genShims.ts && tsc && pnpm build",
"build-single-file": "SINGLE_FILE_BUILD=true rsbuild build",
Fix environment variable mismatch.
Your code checks `SINGLE_FILE_BUILD_MODE`, but the script sets `SINGLE_FILE_BUILD`. This mismatch will prevent the logic from running as intended.
- "build-single-file": "SINGLE_FILE_BUILD=true rsbuild build",
+ "build-single-file": "SINGLE_FILE_BUILD_MODE=true rsbuild build",
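Either spelling works as long as both sides agree; one way to make such drift harmless is to normalize the flag in a single helper (a sketch — the real build config may resolve this differently):

```javascript
// Accept either spelling so scripts and code can't silently disagree.
function isSingleFileBuild (env) {
  return env.SINGLE_FILE_BUILD === 'true' || env.SINGLE_FILE_BUILD_MODE === 'true'
}
```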
} catch (err) {
  // console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
  throw err
}
🛠️ Refactor suggestion
Remove or refine the outer try/catch that only rethrows.
Similarly, this outer catch block does not provide additional handling. Either add unique handling logic or remove it.
} catch (err) {
// console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
- throw err
+ // Log or remove this block if not needed
}
Committable suggestion skipped: line range outside the PR's diff.
🧰 Tools
🪛 Biome (1.9.4)
[error] 99-99: The catch clause that only rethrows the original error is useless.
An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.
(lint/complexity/noUselessCatch)
try {
  processRecipe(recipe)
} catch (err) {
  // console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
  // delete current[i]
  throw err
🛠️ Refactor suggestion
Remove or refine the inner try/catch that only rethrows.
This catch block merely rethrows the error, making it a “no-useless-catch” case. If you want to log or handle the error specifically, add relevant logic; otherwise, consider removing the catch for cleaner code.
try {
processRecipe(recipe)
} catch (err) {
- throw err
+ // Log the error or remove the catch block
}
Committable suggestion skipped: line range outside the PR's diff.
🧰 Tools
🪛 Biome (1.9.4)
[error] 93-93: The catch clause that only rethrows the original error is useless.
An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.
(lint/complexity/noUselessCatch)
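If the catch is kept as a placeholder for future handling, rethrowing with added context makes it earn its keep instead of being a bare `throw err` — a sketch:

```javascript
function processAll (recipes, processRecipe) {
  for (const [i, recipe] of recipes.entries()) {
    try {
      processRecipe(recipe)
    } catch (err) {
      // Wrap instead of rethrowing bare, so the catch clause adds information.
      throw new Error(`recipe ${i} failed: ${err.message}`)
    }
  }
}
```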
Summary by CodeRabbit
New Features
Refactor