chore(deps): update dependency esbuild to v0.25.8 #1165
[puLL-Merge] - evanw/[email protected]

diff --git CHANGELOG.md CHANGELOG.md
index 3574f2189a8..11c14a4a013 100644
--- CHANGELOG.md
+++ CHANGELOG.md
@@ -1,5 +1,97 @@
# Changelog
+## 0.25.8
+
+* Fix another TypeScript parsing edge case ([#4248](https://github.com/evanw/esbuild/issues/4248))
+
+ This fixes a regression with a change in the previous release that tries to more accurately parse TypeScript arrow functions inside the `?:` operator. The regression specifically involves parsing an arrow function containing a `#private` identifier inside the middle of a `?:` ternary operator inside a class body. This was fixed by propagating private identifier state into the parser clone used to speculatively parse the arrow function body. Here is an example of some affected code:
+
+ ```ts
+ class CachedDict {
+ #has = (a: string) => dict.has(a);
+ has = window
+ ? (word: string): boolean => this.#has(word)
+ : this.#has;
+ }
+ ```
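
The affected pattern runs fine as plain JavaScript once the type annotations are stripped; here is a minimal runnable sketch, with `dict` and the `window` condition replaced by concrete stand-ins for illustration:

```js
// Plain-JS sketch of the affected pattern; "dict" and the ternary
// condition are concrete stand-ins, not part of the original report
const dict = new Set(['word']);

class CachedDict {
  #has = (a) => dict.has(a);
  has = true
    ? (word) => this.#has(word)
    : this.#has;
}

console.log(new CachedDict().has('word')); // → true
```

The parsing difficulty is only in TypeScript, where the parser must decide whether the `:` after `(word: string)` starts a return type annotation or closes the ternary.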
+
+* Fix a regression with the parsing of source phase imports
+
+ The change in the previous release to parse [source phase imports](https://github.com/tc39/proposal-source-phase-imports) failed to properly handle the following cases:
+
+ ```ts
+ import source from 'bar'
+ import source from from 'bar'
+ import source type foo from 'bar'
+ ```
+
+ Parsing for these cases should now be fixed. The first case was incorrectly treated as a syntax error because esbuild was expecting the second case. And the last case was previously allowed but is now forbidden. TypeScript hasn't added this feature yet so it remains to be seen whether the last case will be allowed, but it's safer to disallow it for now. At least Babel doesn't allow the last case when parsing TypeScript, and Babel was involved with the source phase import specification.
+
+## 0.25.7
+
+* Parse and print JavaScript imports with an explicit phase ([#4238](https://github.com/evanw/esbuild/issues/4238))
+
+ This release adds basic syntax support for the `defer` and `source` import phases in JavaScript:
+
+ * `defer`
+
+ This is a [stage 3 proposal](https://github.com/tc39/proposal-defer-import-eval) for an upcoming JavaScript feature that will provide one way to eagerly load but lazily initialize imported modules. The imported module is automatically initialized on first use. Support for this syntax will also be part of the upcoming release of [TypeScript 5.9](https://devblogs.microsoft.com/typescript/announcing-typescript-5-9-beta/#support-for-import-defer). The syntax looks like this:
+
+ ```js
+ import defer * as foo from "<specifier>";
+ const bar = await import.defer("<specifier>");
+ ```
+
+ Note that this feature deliberately cannot be used with the syntax `import defer foo from "<specifier>"` or `import defer { foo } from "<specifier>"`.
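
Conceptually, a deferred namespace is loaded eagerly but its top-level code runs only on first property access. A rough plain-JS analogy (not the actual spec semantics) can be sketched with a lazy `Proxy`:

```js
// Rough analogy only: run the module's top-level code the first
// time the namespace object is touched, not at import time
function makeDeferredNamespace(initModule) {
  let ns = null;
  return new Proxy({}, {
    get(_target, key) {
      if (ns === null) ns = initModule(); // initialize on first use
      return ns[key];
    },
  });
}

const foo = makeDeferredNamespace(() => {
  console.log('module initialized');
  return { greet: () => 'hi' };
});

console.log(foo.greet()); // initialization happens here, then prints "hi"
```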
+
+ * `source`
+
+ This is a [stage 3 proposal](https://github.com/tc39/proposal-source-phase-imports) for an upcoming JavaScript feature that will provide another way to eagerly load but lazily initialize imported modules. The imported module is returned in an uninitialized state. Support for this syntax may or may not be a part of TypeScript 5.9 (see [this issue](https://github.com/microsoft/TypeScript/issues/61216) for details). The syntax looks like this:
+
+ ```js
+ import source foo from "<specifier>";
+ const bar = await import.source("<specifier>");
+ ```
+
+    Note that this feature deliberately cannot be used with the syntax `import source * as foo from "<specifier>"` or `import source { foo } from "<specifier>"`.
+
+ This change only adds support for this syntax. These imports cannot currently be bundled by esbuild. To use these new features with esbuild's bundler, the imported paths must be external to the bundle and the output format must be set to `esm`.
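
Concretely, a build configuration satisfying both constraints might look like this (a sketch in the shape of esbuild's JS API options; the entry point and external name are placeholders):

```js
// Sketch of build options for using phase imports with esbuild's bundler:
// the phase-imported path stays external and the output format is ESM
const buildOptions = {
  entryPoints: ['app.js'], // placeholder entry point
  bundle: true,
  format: 'esm',           // phase imports require ESM output
  external: ['bar'],       // the source/defer-imported path must be external
};

console.log(buildOptions.format); // → esm
```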
+
+* Support optionally emitting absolute paths instead of relative paths ([#338](https://github.com/evanw/esbuild/issues/338), [#2082](https://github.com/evanw/esbuild/issues/2082), [#3023](https://github.com/evanw/esbuild/issues/3023))
+
+ This release introduces the `--abs-paths=` feature which takes a comma-separated list of situations where esbuild should use absolute paths instead of relative paths. There are currently three supported situations: `code` (comments and string literals), `log` (log message text and location info), and `metafile` (the JSON build metadata).
+
+ Using absolute paths instead of relative paths is not the default behavior because it means that the build results are no longer machine-independent (which means builds are no longer reproducible). Absolute paths can be useful when used with certain terminal emulators that allow you to click on absolute paths in the terminal text and/or when esbuild is being automatically invoked from several different directories within the same script.
+
+* Fix a TypeScript parsing edge case ([#4241](https://github.com/evanw/esbuild/issues/4241))
+
+ This release fixes an edge case with parsing an arrow function in TypeScript with a return type that's in the middle of a `?:` ternary operator. For example:
+
+ ```ts
+ x = a ? (b) : c => d;
+ y = a ? (b) : c => d : e;
+ ```
+
+ The `:` token in the value assigned to `x` pairs with the `?` token, so it's not the start of a return type annotation. However, the first `:` token in the value assigned to `y` is the start of a return type annotation because after parsing the arrow function body, it turns out there's another `:` token that can be used to pair with the `?` token. This case is notable as it's the first TypeScript edge case that esbuild has needed a backtracking parser to parse. It has been addressed by a quick hack (cloning the whole parser) as it's a rare edge case and esbuild doesn't otherwise need a backtracking parser. Hopefully this is sufficient and doesn't cause any issues.
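
In plain JavaScript the first line is unambiguous; a runnable sketch of how the `x` case evaluates, with concrete values chosen for `a`, `b`, and `d`:

```js
// With a false, the ternary's else-branch is the arrow function c => d,
// so x is a function rather than the value of b
const a = false, b = 1, d = 2;
const x = a ? (b) : c => d;

console.log(typeof x); // → function
console.log(x(0));     // → 2
```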
+
+* Inline small constant strings when minifying
+
+ Previously esbuild's minifier didn't inline string constants because strings can be arbitrarily long, and this isn't necessarily a size win if the string is used more than once. Starting with this release, esbuild will now inline string constants when the length of the string is three code units or less. For example:
+
+ ```js
+ // Original code
+ const foo = 'foo'
+ console.log({ [foo]: true })
+
+ // Old output (with --minify --bundle --format=esm)
+ var o="foo";console.log({[o]:!0});
+
+ // New output (with --minify --bundle --format=esm)
+ console.log({foo:!0});
+ ```
+
+ Note that esbuild's constant inlining only happens in very restrictive scenarios to avoid issues with TDZ handling. This change doesn't change when esbuild's constant inlining happens. It only expands the scope of it to include certain string literals in addition to numeric and boolean literals.
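
The rewrite is behavior-preserving because a computed property key built from a constant string produces the same property as writing the literal key directly; a quick plain-JS check:

```js
// A computed key using a constant short string is equivalent
// to the corresponding literal key
const foo = 'foo';
const viaComputed = { [foo]: true };
const viaLiteral = { foo: true };

console.log(viaComputed.foo); // → true
console.log(JSON.stringify(viaComputed) === JSON.stringify(viaLiteral)); // → true
```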
+
## 0.25.6
* Fix a memory leak when `cancel()` is used on a build context ([#4231](https://github.com/evanw/esbuild/issues/4231))
diff --git cmd/esbuild/main.go cmd/esbuild/main.go
index 464c2db4468..3ff4481efdd 100644
--- cmd/esbuild/main.go
+++ cmd/esbuild/main.go
@@ -54,9 +54,10 @@ var helpText = func(colors logger.Colors) string {
safari11, edge16, node10, ie9, opera45, default esnext)
--watch Watch mode: rebuild on file system changes (stops when
stdin is closed, use "--watch=forever" to ignore stdin)
- --watch-delay=... How many milliseconds to wait before watch mode rebuilds
` + colors.Bold + `Advanced options:` + colors.Reset + `
+ --abs-paths=... Emit absolute instead of relative paths in these
+ situations (code | log | metafile)
--allow-overwrite Allow output files to overwrite input files
--analyze Print a report about the contents of the bundle
(use "--analyze=verbose" for a detailed report)
@@ -131,6 +132,7 @@ var helpText = func(colors logger.Colors) string {
--tsconfig=... Use this tsconfig.json file instead of other ones
--tsconfig-raw=... Override all tsconfig.json files with this string
--version Print the current version (` + esbuildVersion + `) and exit
+ --watch-delay=... Wait before watch mode rebuilds (in milliseconds)
` + colors.Bold + `Examples:` + colors.Reset + `
` + colors.Dim + `# Produces dist/entry_point.js and dist/entry_point.js.map` + colors.Reset + `
diff --git cmd/esbuild/service.go cmd/esbuild/service.go
index 24721aca1a9..ee2b439c2c9 100644
--- cmd/esbuild/service.go
+++ cmd/esbuild/service.go
@@ -1397,8 +1397,9 @@ func decodeLocationToPrivate(value interface{}) *logger.MsgLocation {
if namespace == "" {
namespace = "file"
}
+ file := loc["file"].(string)
return &logger.MsgLocation{
- File: loc["file"].(string),
+ File: logger.PrettyPaths{Abs: file, Rel: file},
Namespace: namespace,
Line: loc["line"].(int),
Column: loc["column"].(int),
diff --git cmd/esbuild/version.go cmd/esbuild/version.go
index 2ceac75df37..6e399eeda6c 100644
--- cmd/esbuild/version.go
+++ cmd/esbuild/version.go
@@ -1,3 +1,3 @@
package main
-const esbuildVersion = "0.25.6"
+const esbuildVersion = "0.25.8"
diff --git compat-table/src/index.ts compat-table/src/index.ts
index f01b7b837b1..b719017a45a 100644
--- compat-table/src/index.ts
+++ compat-table/src/index.ts
@@ -59,7 +59,9 @@ export const jsFeatures = {
Hashbang: true,
ImportAssertions: true,
ImportAttributes: true,
+ ImportDefer: true,
ImportMeta: true,
+ ImportSource: true,
InlineScript: true,
LogicalAssignment: true,
NestedRestBinding: true,
diff --git internal/ast/ast.go internal/ast/ast.go
index 67d2e5b8a54..96c4e54227a 100644
--- internal/ast/ast.go
+++ internal/ast/ast.go
@@ -78,6 +78,18 @@ func (kind ImportKind) MustResolveToCSS() bool {
return false
}
+type ImportPhase uint8
+
+const (
+ EvaluationPhase ImportPhase = iota
+
+ // See: https://github.com/tc39/proposal-defer-import-eval
+ DeferPhase
+
+ // See: https://github.com/tc39/proposal-source-phase-imports
+ SourcePhase
+)
+
type ImportRecordFlags uint16
const (
@@ -167,6 +179,7 @@ type ImportRecord struct {
CopySourceIndex Index32
Flags ImportRecordFlags
+ Phase ImportPhase
Kind ImportKind
}
diff --git internal/bundler/bundler.go internal/bundler/bundler.go
index b6b6c0c3f9b..9c80c46ca3f 100644
--- internal/bundler/bundler.go
+++ internal/bundler/bundler.go
@@ -84,7 +84,7 @@ type parseArgs struct {
log logger.Log
res *resolver.Resolver
caches *cache.CacheSet
- prettyPath string
+ prettyPaths logger.PrettyPaths
importSource *logger.Source
importWith *ast.ImportAssertOrWith
sideEffects graph.SideEffects
@@ -110,7 +110,7 @@ type parseResult struct {
type globResolveResult struct {
resolveResults map[string]resolver.ResolveResult
absPath string
- prettyPath string
+ prettyPaths logger.PrettyPaths
exportAlias string
}
@@ -142,7 +142,7 @@ func parseFile(args parseArgs) {
source := logger.Source{
Index: args.sourceIndex,
KeyPath: args.keyPath,
- PrettyPath: args.prettyPath,
+ PrettyPaths: args.prettyPaths,
IdentifierName: js_ast.GenerateNonUniqueNameFromPath(pathForIdentifierName),
}
@@ -170,6 +170,7 @@ func parseFile(args parseArgs) {
args.importPathRange,
args.pluginData,
args.options.WatchMode,
+ args.options.LogPathStyle,
)
if !ok {
if args.inject != nil {
@@ -250,7 +251,7 @@ func parseFile(args parseArgs) {
r := recover()
if r != nil {
args.log.AddErrorWithNotes(nil, logger.Range{},
- fmt.Sprintf("panic: %v (while parsing %q)", r, source.PrettyPath),
+ fmt.Sprintf("panic: %v (while parsing %q)", r, source.PrettyPaths.Select(args.options.LogPathStyle)),
[]logger.MsgData{{Text: helpers.PrettyPrintedStack()}})
args.results <- result
}
@@ -415,9 +416,9 @@ func parseFile(args parseArgs) {
default:
var message string
if source.KeyPath.Namespace == "file" && ext != "" {
- message = fmt.Sprintf("No loader is configured for %q files: %s", ext, source.PrettyPath)
+ message = fmt.Sprintf("No loader is configured for %q files: %s", ext, source.PrettyPaths.Select(args.options.LogPathStyle))
} else {
- message = fmt.Sprintf("Do not know how to load path: %s", source.PrettyPath)
+ message = fmt.Sprintf("Do not know how to load path: %s", source.PrettyPaths.Select(args.options.LogPathStyle))
}
tracker := logger.MakeLineColumnTracker(args.importSource)
args.log.AddError(&tracker, args.importPathRange, message)
@@ -468,11 +469,18 @@ func parseFile(args parseArgs) {
// Special-case glob pattern imports
if record.GlobPattern != nil {
prettyPath := helpers.GlobPatternToString(record.GlobPattern.Parts)
+ phase := ""
+ switch record.Phase {
+ case ast.DeferPhase:
+ phase = ".defer"
+ case ast.SourcePhase:
+ phase = ".source"
+ }
switch record.GlobPattern.Kind {
case ast.ImportRequire:
- prettyPath = fmt.Sprintf("require(%q)", prettyPath)
+ prettyPath = fmt.Sprintf("require%s(%q)", phase, prettyPath)
case ast.ImportDynamic:
- prettyPath = fmt.Sprintf("import(%q)", prettyPath)
+ prettyPath = fmt.Sprintf("import%s(%q)", phase, prettyPath)
}
if results, msg := args.res.ResolveGlob(absResolveDir, record.GlobPattern.Parts, record.GlobPattern.Kind, prettyPath); results != nil {
if msg != nil {
@@ -481,7 +489,11 @@ func parseFile(args parseArgs) {
if result.globResolveResults == nil {
result.globResolveResults = make(map[uint32]globResolveResult)
}
+ allAreExternal := true
for key, result := range results {
+ if !result.PathPair.IsExternal {
+ allAreExternal = false
+ }
result.PathPair.Primary.ImportAttributes = attrs
if result.PathPair.HasSecondary() {
result.PathPair.Secondary.ImportAttributes = attrs
@@ -491,8 +503,17 @@ func parseFile(args parseArgs) {
result.globResolveResults[uint32(importRecordIndex)] = globResolveResult{
resolveResults: results,
absPath: args.fs.Join(absResolveDir, "(glob)"),
- prettyPath: fmt.Sprintf("%s in %s", prettyPath, result.file.inputFile.Source.PrettyPath),
- exportAlias: record.GlobPattern.ExportAlias,
+ prettyPaths: logger.PrettyPaths{
+ Abs: fmt.Sprintf("%s in %s", prettyPath, result.file.inputFile.Source.PrettyPaths.Abs),
+ Rel: fmt.Sprintf("%s in %s", prettyPath, result.file.inputFile.Source.PrettyPaths.Rel),
+ },
+ exportAlias: record.GlobPattern.ExportAlias,
+ }
+
+ // Forbid bundling of imports with explicit phases
+ if record.Phase != ast.EvaluationPhase {
+ reportExplicitPhaseImport(args.log, &tracker, record.Range,
+ record.Phase, allAreExternal, args.options.OutputFormat)
}
} else {
args.log.AddError(&tracker, record.Range, fmt.Sprintf("Could not resolve %s", prettyPath))
@@ -531,6 +552,7 @@ func parseFile(args parseArgs) {
record.Kind,
absResolveDir,
pluginData,
+ args.options.LogPathStyle,
)
if resolveResult != nil {
resolveResult.PathPair.Primary.ImportAttributes = attrs
@@ -570,8 +592,9 @@ func parseFile(args parseArgs) {
// fallback.
if !entry.didLogError && !record.Flags.Has(ast.HandlesImportErrors) {
// Report an error
- text, suggestion, notes := ResolveFailureErrorTextSuggestionNotes(args.res, record.Path.Text, record.Kind,
- pluginName, args.fs, absResolveDir, args.options.Platform, source.PrettyPath, entry.debug.ModifiedImportPath)
+ text, suggestion, notes := ResolveFailureErrorTextSuggestionNotes(
+ args.res, record.Path.Text, record.Kind, pluginName, args.fs, absResolveDir, args.options.Platform,
+ source.PrettyPaths, entry.debug.ModifiedImportPath, args.options.LogPathStyle)
entry.debug.LogErrorMsg(args.log, &source, record.Range, text, suggestion, notes)
// Only report this error once per unique import path in the file
@@ -587,6 +610,12 @@ func parseFile(args parseArgs) {
continue
}
+ // Forbid bundling of imports with explicit phases
+ if record.Phase != ast.EvaluationPhase {
+ reportExplicitPhaseImport(args.log, &tracker, record.Range,
+ record.Phase, entry.resolveResult.PathPair.IsExternal, args.options.OutputFormat)
+ }
+
result.resolveResults[importRecordIndex] = entry.resolveResult
}
}
@@ -606,22 +635,25 @@ func parseFile(args parseArgs) {
tracker := logger.MakeLineColumnTracker(&source)
if path, contents := extractSourceMapFromComment(args.log, args.fs, &args.caches.FSCache,
- &source, &tracker, sourceMapComment, absResolveDir); contents != nil {
- prettyPath := resolver.PrettyPath(args.fs, path)
+ &source, &tracker, sourceMapComment, absResolveDir, args.options.LogPathStyle); contents != nil {
+ prettyPaths := resolver.MakePrettyPaths(args.fs, path)
log := logger.NewDeferLog(logger.DeferLogNoVerboseOrDebug, args.log.Overrides)
sourceMap := js_parser.ParseSourceMap(log, logger.Source{
- KeyPath: path,
- PrettyPath: prettyPath,
- Contents: *contents,
+ KeyPath: path,
+ PrettyPaths: prettyPaths,
+ Contents: *contents,
})
if msgs := log.Done(); len(msgs) > 0 {
var text string
if path.Namespace == "file" {
- text = fmt.Sprintf("The source map %q was referenced by the file %q here:", prettyPath, args.prettyPath)
+ text = fmt.Sprintf("The source map %q was referenced by the file %q here:",
+ prettyPaths.Select(args.options.LogPathStyle),
+ args.prettyPaths.Select(args.options.LogPathStyle))
} else {
- text = fmt.Sprintf("This source map came from the file %q here:", args.prettyPath)
+ text = fmt.Sprintf("This source map came from the file %q here:",
+ args.prettyPaths.Select(args.options.LogPathStyle))
}
note := tracker.MsgData(sourceMapComment.Range, text)
for _, msg := range msgs {
@@ -714,7 +746,8 @@ func parseFile(args parseArgs) {
// This is not allowed because the import path would have to be rewritten,
// but import paths are not rewritten when bundling isn't enabled.
args.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot inject %q with the \"copy\" loader without bundling enabled", source.PrettyPath))
+ fmt.Sprintf("Cannot inject %q with the \"copy\" loader without bundling enabled",
+ source.PrettyPaths.Select(args.options.LogPathStyle)))
}
args.inject <- config.InjectedFile{
Source: source,
@@ -726,6 +759,30 @@ func parseFile(args parseArgs) {
args.results <- result
}
+func reportExplicitPhaseImport(
+ log logger.Log,
+ tracker *logger.LineColumnTracker,
+ r logger.Range,
+ phase ast.ImportPhase,
+ isExternal bool,
+ format config.Format,
+) {
+ var phaseText string
+ switch phase {
+ case ast.DeferPhase:
+ phaseText = "deferred"
+ case ast.SourcePhase:
+ phaseText = "source phase"
+ default:
+ return
+ }
+ if format != config.FormatESModule {
+ log.AddError(tracker, r, fmt.Sprintf("Bundling %s imports with the %q output format is not supported", phaseText, format.String()))
+ } else if !isExternal {
+ log.AddError(tracker, r, fmt.Sprintf("Bundling with %s imports is not supported unless they are external", phaseText))
+ }
+}
+
func ResolveFailureErrorTextSuggestionNotes(
res *resolver.Resolver,
path string,
@@ -734,8 +791,9 @@ func ResolveFailureErrorTextSuggestionNotes(
fs fs.FS,
absResolveDir string,
platform config.Platform,
- originatingFilePath string,
+ originatingFilePaths logger.PrettyPaths,
modifiedImportPath string,
+ logPathStyle logger.PathStyle,
) (text string, suggestion string, notes []logger.MsgData) {
if modifiedImportPath != "" {
text = fmt.Sprintf("Could not resolve %q (originally %q)", modifiedImportPath, path)
@@ -758,9 +816,10 @@ func ResolveFailureErrorTextSuggestionNotes(
}
if pluginName == "" && !fs.IsAbs(path) {
if query, _ := res.ProbeResolvePackageAsRelative(absResolveDir, path, kind); query != nil {
+ prettyPaths := resolver.MakePrettyPaths(fs, query.PathPair.Primary)
hint = fmt.Sprintf("Use the relative path %q to reference the file %q. "+
"Without the leading \"./\", the path %q is being interpreted as a package path instead.",
- "./"+path, resolver.PrettyPath(fs, query.PathPair.Primary), path)
+ "./"+path, prettyPaths.Select(logPathStyle), path)
suggestion = string(helpers.QuoteForJSON("./"+path, false))
}
}
@@ -785,8 +844,8 @@ func ResolveFailureErrorTextSuggestionNotes(
if absResolveDir == "" && pluginName != "" {
where := ""
- if originatingFilePath != "" {
- where = fmt.Sprintf(" for the file %q", originatingFilePath)
+ if originatingFilePaths != (logger.PrettyPaths{}) {
+ where = fmt.Sprintf(" for the file %q", originatingFilePaths.Select(logPathStyle))
}
hint = fmt.Sprintf("The plugin %q didn't set a resolve directory%s, "+
"so esbuild did not search for %q on the file system.", pluginName, where, path)
@@ -832,6 +891,7 @@ func extractSourceMapFromComment(
tracker *logger.LineColumnTracker,
comment logger.Span,
absResolveDir string,
+ logPathStyle logger.PathStyle,
) (logger.Path, *string) {
// Support data URLs
if parsed, ok := resolver.ParseDataURL(comment.Text); ok {
@@ -893,8 +953,9 @@ func extractSourceMapFromComment(
fmt.Sprintf("Cannot read file: %s", absPath))
return logger.Path{}, nil
} else if err != nil {
+ prettyPaths := resolver.MakePrettyPaths(fs, path)
log.AddID(logger.MsgID_SourceMap_MissingSourceMap, logger.Warning, tracker, comment.Range,
- fmt.Sprintf("Cannot read file %q: %s", resolver.PrettyPath(fs, path), err.Error()))
+ fmt.Sprintf("Cannot read file %q: %s", prettyPaths.Select(logPathStyle), err.Error()))
return logger.Path{}, nil
} else {
return path, &contents
@@ -906,8 +967,8 @@ func sanitizeLocation(fs fs.FS, loc *logger.MsgLocation) {
if loc.Namespace == "" {
loc.Namespace = "file"
}
- if loc.File != "" {
- loc.File = resolver.PrettyPath(fs, logger.Path{Text: loc.File, Namespace: loc.Namespace})
+ if loc.File != (logger.PrettyPaths{}) {
+ loc.File = resolver.MakePrettyPaths(fs, logger.Path{Text: loc.File.Abs, Namespace: loc.Namespace})
}
}
}
@@ -942,8 +1003,8 @@ func logPluginMessages(
} else {
sanitizeLocation(fs, msg.Data.Location)
if importSource != nil {
- if msg.Data.Location.File == "" {
- msg.Data.Location.File = importSource.PrettyPath
+ if msg.Data.Location.File == (logger.PrettyPaths{}) {
+ msg.Data.Location.File = importSource.PrettyPaths
}
msg.Notes = append(msg.Notes, tracker.MsgData(importPathRange,
fmt.Sprintf("The plugin %q was triggered by this import", name)))
@@ -985,6 +1046,7 @@ func RunOnResolvePlugins(
kind ast.ImportKind,
absResolveDir string,
pluginData interface{},
+ logPathStyle logger.PathStyle,
) (*resolver.ResolveResult, bool, resolver.DebugMeta) {
resolverArgs := config.OnResolveArgs{
Path: path,
@@ -1081,10 +1143,12 @@ func RunOnResolvePlugins(
// Warn when the case used for importing differs from the actual file name
if result != nil && result.DifferentCase != nil && !helpers.IsInsideNodeModules(absResolveDir) {
diffCase := *result.DifferentCase
+ actualPaths := resolver.MakePrettyPaths(fs, logger.Path{Text: fs.Join(diffCase.Dir, diffCase.Actual), Namespace: "file"})
+ queryPaths := resolver.MakePrettyPaths(fs, logger.Path{Text: fs.Join(diffCase.Dir, diffCase.Query), Namespace: "file"})
log.AddID(logger.MsgID_Bundler_DifferentPathCase, logger.Warning, &tracker, importPathRange, fmt.Sprintf(
"Use %q instead of %q to avoid issues with case-sensitive file systems",
- resolver.PrettyPath(fs, logger.Path{Text: fs.Join(diffCase.Dir, diffCase.Actual), Namespace: "file"}),
- resolver.PrettyPath(fs, logger.Path{Text: fs.Join(diffCase.Dir, diffCase.Query), Namespace: "file"}),
+ actualPaths.Select(logPathStyle),
+ queryPaths.Select(logPathStyle),
))
}
@@ -1108,6 +1172,7 @@ func runOnLoadPlugins(
importPathRange logger.Range,
pluginData interface{},
isWatchMode bool,
+ logPathStyle logger.PathStyle,
) (loaderPluginResult, bool) {
loaderArgs := config.OnLoadArgs{
Path: source.KeyPath,
@@ -1191,8 +1256,9 @@ func runOnLoadPlugins(
fmt.Sprintf("Cannot read file: %s", source.KeyPath.Text))
return loaderPluginResult{}, false
} else {
+ prettyPaths := resolver.MakePrettyPaths(fs, source.KeyPath)
log.AddError(&tracker, importPathRange,
- fmt.Sprintf("Cannot read file %q: %s", resolver.PrettyPath(fs, source.KeyPath), err.Error()))
+ fmt.Sprintf("Cannot read file %q: %s", prettyPaths.Select(logPathStyle), err.Error()))
return loaderPluginResult{}, false
}
}
@@ -1456,7 +1522,7 @@ const (
// This returns the source index of the resulting file
func (s *scanner) maybeParseFile(
resolveResult resolver.ResolveResult,
- prettyPath string,
+ prettyPaths logger.PrettyPaths,
importSource *logger.Source,
importPathRange logger.Range,
importWith *ast.ImportAssertOrWith,
@@ -1528,7 +1594,7 @@ func (s *scanner) maybeParseFile(
// Special-case pretty-printed paths for data URLs
if path.Namespace == "dataurl" {
if _, ok := resolver.ParseDataURL(path.Text); ok {
- prettyPath = path.Text
+ prettyPath := path.Text
if len(prettyPath) > 65 {
prettyPath = prettyPath[:65]
}
@@ -1537,6 +1603,8 @@ func (s *scanner) maybeParseFile(
prettyPath = prettyPath[:64] + "..."
}
prettyPath = fmt.Sprintf("<%s>", prettyPath)
+ prettyPaths.Abs = prettyPath
+ prettyPaths.Rel = prettyPath
}
}
@@ -1552,7 +1620,7 @@ func (s *scanner) maybeParseFile(
res: s.res,
caches: s.caches,
keyPath: path,
- prettyPath: prettyPath,
+ prettyPaths: prettyPaths,
sourceIndex: visited.sourceIndex,
importSource: importSource,
sideEffects: sideEffects,
@@ -1625,7 +1693,7 @@ func (s *scanner) preprocessInjectedFiles() {
source := logger.Source{
Index: sourceIndex,
KeyPath: visitedKey,
- PrettyPath: resolver.PrettyPath(s.fs, visitedKey),
+ PrettyPaths: resolver.MakePrettyPaths(s.fs, visitedKey),
IdentifierName: js_ast.EnsureValidIdentifier(visitedKey.Text),
Contents: define.Source.Contents,
}
@@ -1704,6 +1772,7 @@ func (s *scanner) preprocessInjectedFiles() {
ast.ImportEntryPoint,
injectAbsResolveDir,
nil,
+ s.options.LogPathStyle,
)
if resolveResult != nil {
if resolveResult.PathPair.IsExternal {
@@ -1730,7 +1799,7 @@ func (s *scanner) preprocessInjectedFiles() {
for _, resolveResult := range injectResolveResults {
if resolveResult != nil {
channel := make(chan config.InjectedFile, 1)
- s.maybeParseFile(*resolveResult, resolver.PrettyPath(s.fs, resolveResult.PathPair.Primary), nil, logger.Range{}, nil, inputKindNormal, channel)
+ s.maybeParseFile(*resolveResult, resolver.MakePrettyPaths(s.fs, resolveResult.PathPair.Primary), nil, logger.Range{}, nil, inputKindNormal, channel)
injectWaitGroup.Add(1)
// Wait for the results in parallel. The results slice is large enough so
@@ -1773,7 +1842,7 @@ func (s *scanner) addEntryPoints(entryPoints []EntryPoint) []graph.EntryPoint {
}
}
resolveResult := resolver.ResolveResult{PathPair: resolver.PathPair{Primary: stdinPath}}
- sourceIndex := s.maybeParseFile(resolveResult, resolver.PrettyPath(s.fs, stdinPath), nil, logger.Range{}, nil, inputKindStdin, nil)
+ sourceIndex := s.maybeParseFile(resolveResult, resolver.MakePrettyPaths(s.fs, stdinPath), nil, logger.Range{}, nil, inputKindStdin, nil)
entryMetas = append(entryMetas, graph.EntryPoint{
OutputPath: "stdin",
SourceIndex: sourceIndex,
@@ -1884,6 +1953,7 @@ func (s *scanner) addEntryPoints(entryPoints []EntryPoint) []graph.EntryPoint {
ast.ImportEntryPoint,
entryPointAbsResolveDir,
nil,
+ s.options.LogPathStyle,
)
if resolveResult != nil {
if resolveResult.PathPair.IsExternal {
@@ -1895,10 +1965,11 @@ func (s *scanner) addEntryPoints(entryPoints []EntryPoint) []graph.EntryPoint {
var notes []logger.MsgData
if !s.fs.IsAbs(entryPoint.InputPath) {
if query, _ := s.res.ProbeResolvePackageAsRelative(entryPointAbsResolveDir, entryPoint.InputPath, ast.ImportEntryPoint); query != nil {
+ prettyPaths := resolver.MakePrettyPaths(s.fs, query.PathPair.Primary)
notes = append(notes, logger.MsgData{
Text: fmt.Sprintf("Use the relative path %q to reference the file %q. "+
"Without the leading \"./\", the path %q is being interpreted as a package path instead.",
- "./"+entryPoint.InputPath, resolver.PrettyPath(s.fs, query.PathPair.Primary), entryPoint.InputPath),
+ "./"+entryPoint.InputPath, prettyPaths.Select(s.options.LogPathStyle), entryPoint.InputPath),
})
}
}
@@ -1926,14 +1997,14 @@ func (s *scanner) addEntryPoints(entryPoints []EntryPoint) []graph.EntryPoint {
for _, resolveResult := range info.results {
resolveResult := resolveResult
- prettyPath := resolver.PrettyPath(s.fs, resolveResult.PathPair.Primary)
+ prettyPaths := resolver.MakePrettyPaths(s.fs, resolveResult.PathPair.Primary)
outputPath := entryPoints[i].OutputPath
outputPathWasAutoGenerated := false
// If the output path is missing, automatically generate one from the input path
if outputPath == "" {
if info.isGlob {
- outputPath = prettyPath
+ outputPath = prettyPaths.Rel
} else {
outputPath = entryPoints[i].InputPath
}
@@ -1965,7 +2036,7 @@ func (s *scanner) addEntryPoints(entryPoints []EntryPoint) []graph.EntryPoint {
entryPointsToParse = append(entryPointsToParse, entryPointToParse{
index: len(entryMetas),
parse: func() uint32 {
- return s.maybeParseFile(resolveResult, prettyPath, nil, logger.Range{}, nil, inputKindEntryPoint, nil)
+ return s.maybeParseFile(resolveResult, prettyPaths, nil, logger.Range{}, nil, inputKindEntryPoint, nil)
},
})
@@ -2120,7 +2191,7 @@ func (s *scanner) scanAllDependencies() {
sourceIndex := s.allocateGlobSourceIndex(result.file.inputFile.Source.Index, uint32(importRecordIndex))
record.SourceIndex = ast.MakeIndex32(sourceIndex)
s.results[sourceIndex] = s.generateResultForGlobResolve(sourceIndex, globResults.absPath,
- &result.file.inputFile.Source, record.Range, with, record.GlobPattern.Kind, globResults, record.AssertOrWith)
+ &result.file.inputFile.Source, record.Range, with, record.GlobPattern.Kind, record.Phase, globResults, record.AssertOrWith)
}
continue
}
@@ -2128,7 +2199,7 @@ func (s *scanner) scanAllDependencies() {
path := resolveResult.PathPair.Primary
if !resolveResult.PathPair.IsExternal {
// Handle a path within the bundle
- sourceIndex := s.maybeParseFile(*resolveResult, resolver.PrettyPath(s.fs, path),
+ sourceIndex := s.maybeParseFile(*resolveResult, resolver.MakePrettyPaths(s.fs, path),
&result.file.inputFile.Source, record.Range, with, inputKindNormal, nil)
record.SourceIndex = ast.MakeIndex32(sourceIndex)
} else {
@@ -2168,6 +2239,7 @@ func (s *scanner) generateResultForGlobResolve(
importRange logger.Range,
importWith *ast.ImportAssertOrWith,
kind ast.ImportKind,
+ phase ast.ImportPhase,
result globResolveResult,
assertions *ast.ImportAssertOrWith,
) parseResult {
@@ -2191,7 +2263,7 @@ func (s *scanner) generateResultForGlobResolve(
if !resolveResult.PathPair.IsExternal {
sourceIndex = ast.MakeIndex32(s.maybeParseFile(
resolveResult,
- resolver.PrettyPath(s.fs, resolveResult.PathPair.Primary),
+ resolver.MakePrettyPaths(s.fs, resolveResult.PathPair.Primary),
importSource,
importRange,
importWith,
@@ -2221,6 +2293,7 @@ func (s *scanner) generateResultForGlobResolve(
SourceIndex: sourceIndex,
AssertOrWith: assertions,
Kind: kind,
+ Phase: phase,
})
switch kind {
@@ -2242,9 +2315,9 @@ func (s *scanner) generateResultForGlobResolve(
}
source := logger.Source{
- KeyPath: logger.Path{Text: fakeSourcePath, Namespace: "file"},
- PrettyPath: result.prettyPath,
- Index: sourceIndex,
+ KeyPath: logger.Path{Text: fakeSourcePath, Namespace: "file"},
+ PrettyPaths: result.prettyPaths,
+ Index: sourceIndex,
}
ast := js_parser.GlobResolveAST(s.log, source, importRecords, &object, result.exportAlias)
@@ -2279,11 +2352,11 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
}
// Check for pretty-printed path collisions
- importAttributeNameCollisions := make(map[string][]uint32)
+ importAttributeNameCollisions := make(map[logger.PrettyPaths][]uint32)
for sourceIndex := range s.results {
if result := &s.results[sourceIndex]; result.ok {
- prettyPath := result.file.inputFile.Source.PrettyPath
- importAttributeNameCollisions[prettyPath] = append(importAttributeNameCollisions[prettyPath], uint32(sourceIndex))
+ prettyPaths := result.file.inputFile.Source.PrettyPaths
+ importAttributeNameCollisions[prettyPaths] = append(importAttributeNameCollisions[prettyPaths], uint32(sourceIndex))
}
}
@@ -2319,7 +2392,9 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
sb.Write(helpers.QuoteSingle(attr.Value, false))
}
sb.WriteString(" }")
- source.PrettyPath += sb.String()
+ suffix := sb.String()
+ source.PrettyPaths.Abs += suffix
+ source.PrettyPaths.Rel += suffix
}
}
@@ -2334,7 +2409,7 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
// Begin the metadata chunk
if s.options.NeedsMetafile {
- sb.Write(helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPath, s.options.ASCIIOnly))
+ sb.Write(helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPaths.Select(s.options.MetafilePathStyle), s.options.ASCIIOnly))
sb.WriteString(fmt.Sprintf(": {\n \"bytes\": %d,\n \"imports\": [", len(result.file.inputFile.Source.Contents)))
}
@@ -2414,7 +2489,7 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
sb.WriteString(",\n ")
}
sb.WriteString(fmt.Sprintf("{\n \"path\": %s,\n \"kind\": %s,\n \"original\": %s%s\n }",
- helpers.QuoteForJSON(otherFile.inputFile.Source.PrettyPath, s.options.ASCIIOnly),
+ helpers.QuoteForJSON(otherFile.inputFile.Source.PrettyPaths.Select(s.options.MetafilePathStyle), s.options.ASCIIOnly),
helpers.QuoteForJSON(record.Kind.StringForMetafile(), s.options.ASCIIOnly),
helpers.QuoteForJSON(record.Path.Text, s.options.ASCIIOnly),
metafileWith))
@@ -2428,7 +2503,9 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
// runtime evaluates them, not us).
if record.Flags.Has(ast.AssertTypeJSON) && otherResult.ok && otherFile.inputFile.Loader != config.LoaderJSON && otherFile.inputFile.Loader != config.LoaderCopy {
s.log.AddErrorWithNotes(&tracker, record.Range,
- fmt.Sprintf("The file %q was loaded with the %q loader", otherFile.inputFile.Source.PrettyPath, config.LoaderToString[otherFile.inputFile.Loader]),
+ fmt.Sprintf("The file %q was loaded with the %q loader",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ config.LoaderToString[otherFile.inputFile.Loader]),
[]logger.MsgData{
tracker.MsgData(js_lexer.RangeOfImportAssertOrWith(result.file.inputFile.Source,
*ast.FindAssertOrWithEntry(record.AssertOrWith.Entries, "type"), js_lexer.KeyAndValueRange),
@@ -2441,20 +2518,24 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
// Using a JavaScript file with CSS "composes" is not allowed
if _, ok := otherFile.inputFile.Repr.(*graph.JSRepr); ok && otherFile.inputFile.Loader != config.LoaderEmpty {
s.log.AddErrorWithNotes(&tracker, record.Range,
- fmt.Sprintf("Cannot use \"composes\" with %q", otherFile.inputFile.Source.PrettyPath),
+ fmt.Sprintf("Cannot use \"composes\" with %q",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle)),
[]logger.MsgData{{Text: fmt.Sprintf(
"You can only use \"composes\" with CSS files and %q is not a CSS file (it was loaded with the %q loader).",
- otherFile.inputFile.Source.PrettyPath, config.LoaderToString[otherFile.inputFile.Loader])}})
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ config.LoaderToString[otherFile.inputFile.Loader])}})
}
case ast.ImportAt:
// Using a JavaScript file with CSS "@import" is not allowed
if _, ok := otherFile.inputFile.Repr.(*graph.JSRepr); ok && otherFile.inputFile.Loader != config.LoaderEmpty {
s.log.AddErrorWithNotes(&tracker, record.Range,
- fmt.Sprintf("Cannot import %q into a CSS file", otherFile.inputFile.Source.PrettyPath),
+ fmt.Sprintf("Cannot import %q into a CSS file",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle)),
[]logger.MsgData{{Text: fmt.Sprintf(
"An \"@import\" rule can only be used to import another CSS file and %q is not a CSS file (it was loaded with the %q loader).",
- otherFile.inputFile.Source.PrettyPath, config.LoaderToString[otherFile.inputFile.Loader])}})
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ config.LoaderToString[otherFile.inputFile.Loader])}})
}
case ast.ImportURL:
@@ -2462,18 +2543,22 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
switch otherRepr := otherFile.inputFile.Repr.(type) {
case *graph.CSSRepr:
s.log.AddErrorWithNotes(&tracker, record.Range,
- fmt.Sprintf("Cannot use %q as a URL", otherFile.inputFile.Source.PrettyPath),
+ fmt.Sprintf("Cannot use %q as a URL",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle)),
[]logger.MsgData{{Text: fmt.Sprintf(
"You can't use a \"url()\" token to reference a CSS file, and %q is a CSS file (it was loaded with the %q loader).",
- otherFile.inputFile.Source.PrettyPath, config.LoaderToString[otherFile.inputFile.Loader])}})
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ config.LoaderToString[otherFile.inputFile.Loader])}})
case *graph.JSRepr:
if otherRepr.AST.URLForCSS == "" && otherFile.inputFile.Loader != config.LoaderEmpty {
s.log.AddErrorWithNotes(&tracker, record.Range,
- fmt.Sprintf("Cannot use %q as a URL", otherFile.inputFile.Source.PrettyPath),
+ fmt.Sprintf("Cannot use %q as a URL",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle)),
[]logger.MsgData{{Text: fmt.Sprintf(
"You can't use a \"url()\" token to reference the file %q because it was loaded with the %q loader, which doesn't provide a URL to embed in the resulting CSS.",
- otherFile.inputFile.Source.PrettyPath, config.LoaderToString[otherFile.inputFile.Loader])}})
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ config.LoaderToString[otherFile.inputFile.Loader])}})
}
}
}
@@ -2493,7 +2578,8 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
if css, ok := otherFile.inputFile.Repr.(*graph.CSSRepr); ok {
if s.options.WriteToStdout {
s.log.AddError(&tracker, record.Range,
- fmt.Sprintf("Cannot import %q into a JavaScript file without an output path configured", otherFile.inputFile.Source.PrettyPath))
+ fmt.Sprintf("Cannot import %q into a JavaScript file without an output path configured",
+ otherFile.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle)))
} else if !css.JSSourceIndex.IsValid() {
stubKey := otherFile.inputFile.Source.KeyPath
if stubKey.Namespace == "file" {
@@ -2565,7 +2651,7 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
}
s.log.AddIDWithNotes(logger.MsgID_Bundler_IgnoredBareImport, logger.Warning, &tracker, record.Range,
fmt.Sprintf("Ignoring this import because %q was marked as having no side effects%s",
- otherModule.Source.PrettyPath, by), notes)
+ otherModule.Source.PrettyPaths.Select(s.options.LogPathStyle), by), notes)
}
}
}
@@ -2669,13 +2755,13 @@ func (s *scanner) processScannedFiles(entryPointMeta []graph.EntryPoint) []scann
var jsonMetadataChunk string
if s.options.NeedsMetafile {
inputs := fmt.Sprintf("{\n %s: {\n \"bytesInOutput\": %d\n }\n }",
- helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPath, s.options.ASCIIOnly),
+ helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPaths.Select(s.options.MetafilePathStyle), s.options.ASCIIOnly),
len(bytes),
)
entryPointJSON := ""
if isEntryPoint {
entryPointJSON = fmt.Sprintf("\"entryPoint\": %s,\n ",
- helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPath, s.options.ASCIIOnly))
+ helpers.QuoteForJSON(result.file.inputFile.Source.PrettyPaths.Select(s.options.MetafilePathStyle), s.options.ASCIIOnly))
}
jsonMetadataChunk = fmt.Sprintf(
"{\n \"imports\": [],\n \"exports\": [],\n %s\"inputs\": %s,\n \"bytes\": %d\n }",
@@ -2738,7 +2824,7 @@ func (s *scanner) validateTLA(sourceIndex uint32) tlaCheck {
// Require of a top-level await chain is forbidden
if record.Kind == ast.ImportRequire {
var notes []logger.MsgData
- var tlaPrettyPath string
+ var tlaPrettyPaths logger.PrettyPaths
otherSourceIndex := record.SourceIndex.GetIndex()
// Build up a chain of relevant notes for all of the imports
@@ -2747,10 +2833,11 @@ func (s *scanner) validateTLA(sourceIndex uint32) tlaCheck {
parentRepr := parentResult.file.inputFile.Repr.(*graph.JSRepr)
if parentRepr.AST.LiveTopLevelAwaitKeyword.Len > 0 {
- tlaPrettyPath = parentResult.file.inputFile.Source.PrettyPath
+ tlaPrettyPaths = parentResult.file.inputFile.Source.PrettyPaths
tracker := logger.MakeLineColumnTracker(&parentResult.file.inputFile.Source)
notes = append(notes, tracker.MsgData(parentRepr.AST.LiveTopLevelAwaitKeyword,
- fmt.Sprintf("The top-level await in %q is here:", tlaPrettyPath)))
+ fmt.Sprintf("The top-level await in %q is here:",
+ tlaPrettyPaths.Select(s.options.LogPathStyle))))
break
}
@@ -2765,18 +2852,19 @@ func (s *scanner) validateTLA(sourceIndex uint32) tlaCheck {
notes = append(notes, tracker.MsgData(
parentRepr.AST.ImportRecords[parentResult.tlaCheck.importRecordIndex].Range,
fmt.Sprintf("The file %q imports the file %q here:",
- parentResult.file.inputFile.Source.PrettyPath, s.results[otherSourceIndex].file.inputFile.Source.PrettyPath)))
+ parentResult.file.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle),
+ s.results[otherSourceIndex].file.inputFile.Source.PrettyPaths.Select(s.options.LogPathStyle))))
}
var text string
- importedPrettyPath := s.results[record.SourceIndex.GetIndex()].file.inputFile.Source.PrettyPath
+ importedPrettyPaths := s.results[record.SourceIndex.GetIndex()].file.inputFile.Source.PrettyPaths
- if importedPrettyPath == tlaPrettyPath {
+ if importedPrettyPaths == tlaPrettyPaths {
text = fmt.Sprintf("This require call is not allowed because the imported file %q contains a top-level await",
- importedPrettyPath)
+ importedPrettyPaths.Select(s.options.LogPathStyle))
} else {
text = fmt.Sprintf("This require call is not allowed because the transitive dependency %q contains a top-level await",
- tlaPrettyPath)
+ tlaPrettyPaths.Select(s.options.LogPathStyle))
}
tracker := logger.MakeLineColumnTracker(&result.file.inputFile.Source)
@@ -3001,7 +3089,7 @@ func (b *Bundle) Compile(log logger.Log, timer *helpers.Timer, mangleCache map[s
}
log.AddError(nil, logger.Range{},
fmt.Sprintf("Refusing to overwrite input file %q%s",
- b.files[sourceIndex].inputFile.Source.PrettyPath, hint))
+ b.files[sourceIndex].inputFile.Source.PrettyPaths.Select(options.LogPathStyle), hint))
}
}
}
@@ -3185,11 +3273,12 @@ func (b *Bundle) generateMetadataJSON(results []graph.OutputFile, allReachableFi
// Write outputs
isFirst = true
- paths := make(map[string]bool)
+ pathMap := make(map[string]struct{})
for _, result := range results {
if len(result.JSONMetadataChunk) > 0 {
- path := resolver.PrettyPath(b.fs, logger.Path{Text: result.AbsPath, Namespace: "file"})
- if paths[path] {
+ prettyPaths := resolver.MakePrettyPaths(b.fs, logger.Path{Text: result.AbsPath, Namespace: "file"})
+ path := prettyPaths.Select(b.options.MetafilePathStyle)
+ if _, ok := pathMap[path]; ok {
// Don't write out the same path twice (can happen with the "file" loader)
continue
}
@@ -3199,7 +3288,7 @@ func (b *Bundle) generateMetadataJSON(results []graph.OutputFile, allReachableFi
} else {
sb.WriteString(",\n ")
}
- paths[path] = true
+ pathMap[path] = struct{}{}
sb.WriteString(fmt.Sprintf("%s: ", helpers.QuoteForJSON(path, asciiOnly)))
sb.WriteString(result.JSONMetadataChunk)
}
diff --git internal/bundler_tests/bundler_dce_test.go internal/bundler_tests/bundler_dce_test.go
index 522bfeef810..f015f1f7d87 100644
--- internal/bundler_tests/bundler_dce_test.go
+++ internal/bundler_tests/bundler_dce_test.go
@@ -2859,7 +2859,7 @@ func TestConstValueInliningNoBundle(t *testing.T) {
const u_keep = undefined
const i_keep = 1234567
const f_keep = 123.456
- const s_keep = ''
+ const s_keep = 'abc'
// Values should still be inlined
console.log(
@@ -2877,13 +2877,15 @@ func TestConstValueInliningNoBundle(t *testing.T) {
const REMOVE_u = undefined
const REMOVE_i = 1234567
const REMOVE_f = 123.456
- const s_keep = '' // String inlining is intentionally not supported right now
+ const REMOVE_s = 'abc' // String inlining is intentionally not supported right now
+ const s_keep = 'Long strings are not inlined as constants'
console.log(
// These are doubled to avoid the "inline const/let into next statement if used once" optimization
REMOVE_n, REMOVE_n,
REMOVE_u, REMOVE_u,
REMOVE_i, REMOVE_i,
REMOVE_f, REMOVE_f,
+ REMOVE_s, REMOVE_s,
s_keep, s_keep,
)
}
@@ -2894,13 +2896,15 @@ func TestConstValueInliningNoBundle(t *testing.T) {
const REMOVE_u = undefined
const REMOVE_i = 1234567
const REMOVE_f = 123.456
- const s_keep = '' // String inlining is intentionally not supported right now
+ const REMOVE_s = 'abc' // String inlining is intentionally not supported right now
+ const s_keep = 'Long strings are not inlined as constants'
console.log(
// These are doubled to avoid the "inline const/let into next statement if used once" optimization
REMOVE_n, REMOVE_n,
REMOVE_u, REMOVE_u,
REMOVE_i, REMOVE_i,
REMOVE_f, REMOVE_f,
+ REMOVE_s, REMOVE_s,
s_keep, s_keep,
)
}
@@ -3476,6 +3480,63 @@ func TestCrossModuleConstantFoldingString(t *testing.T) {
})
}
+func TestCrossModuleConstantFoldingComputedPropertyName(t *testing.T) {
+ dce_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/enum-constants.ts": `
+ export enum x {
+ a = 123,
+ b = 'abc',
+ proto = '__proto__',
+ ptype = 'prototype',
+ ctor = 'constructor',
+ }
+ `,
+ "/enum-entry.ts": `
+ import { x } from './enum-constants'
+ console.log({
+ [x.a]: x.a,
+ [x.b]: x.b,
+ })
+ class Foo {
+ [x.proto] = {};
+ [x.ptype] = {};
+ [x.ctor]() {};
+ }
+ `,
+
+ "/const-constants.js": `
+ export const a = 456
+ export const b = 'xyz'
+ export const proto = '__proto__'
+ export const ptype = 'prototype'
+ export const ctor = 'constructor'
+ `,
+ "/const-entry.js": `
+ import { a, b, proto, ptype, ctor } from './const-constants'
+ console.log({
+ [a]: a,
+ [b]: b,
+ })
+ class Foo {
+ [proto] = {};
+ [ptype] = {};
+ [ctor]() {};
+ }
+ `,
+ },
+ entryPaths: []string{
+ "/enum-entry.ts",
+ "/const-entry.js",
+ },
+ options: config.Options{
+ Mode: config.ModeBundle,
+ AbsOutputDir: "/out",
+ MinifySyntax: true,
+ },
+ })
+}
+
func TestMultipleDeclarationTreeShaking(t *testing.T) {
dce_suite.expectBundled(t, bundled{
files: map[string]string{
diff --git a/internal/bundler_tests/bundler_importphase_test.go b/internal/bundler_tests/bundler_importphase_test.go
new file mode 100644
index 00000000000..4a75f513499
--- /dev/null
+++ internal/bundler_tests/bundler_importphase_test.go
@@ -0,0 +1,435 @@
+package bundler_tests
+
+import (
+ "testing"
+
+ "github.com/evanw/esbuild/internal/config"
+)
+
+var importphase_suite = suite{
+ name: "importphase",
+}
+
+func TestImportDeferExternalESM(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatESModule,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ })
+}
+
+func TestImportDeferExternalCommonJS(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatCommonJS,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+`,
+ })
+}
+
+func TestImportDeferExternalIIFE(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatIIFE,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+`,
+ })
+}
+
+func TestImportDeferInternalESM(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatESModule,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+entry.js: ERROR: Bundling with deferred imports is not supported unless they are external
+`,
+ })
+}
+
+func TestImportDeferInternalCommonJS(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatCommonJS,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "cjs" output format is not supported
+`,
+ })
+}
+
+func TestImportDeferInternalIIFE(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import defer * as foo0 from './foo.json'
+ import defer * as foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.defer('./foo.json'),
+ import.defer('./foo.json', { with: { type: 'json' } }),
+ import.defer(` + "`./${foo}.json`" + `),
+ import.defer(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatIIFE,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling deferred imports with the "iife" output format is not supported
+`,
+ })
+}
+
+func TestImportSourceExternalESM(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatESModule,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ })
+}
+
+func TestImportSourceExternalCommonJS(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatCommonJS,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+`,
+ })
+}
+
+func TestImportSourceExternalIIFE(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatIIFE,
+ AbsOutputFile: "/out.js",
+ ExternalSettings: config.ExternalSettings{
+ PreResolve: config.ExternalMatchers{
+ Patterns: []config.WildcardPattern{{Suffix: ".json"}},
+ },
+ },
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+`,
+ })
+}
+
+func TestImportSourceInternalESM(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatESModule,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+entry.js: ERROR: Bundling with source phase imports is not supported unless they are external
+`,
+ })
+}
+
+func TestImportSourceInternalCommonJS(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatCommonJS,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "cjs" output format is not supported
+`,
+ })
+}
+
+func TestImportSourceInternalIIFE(t *testing.T) {
+ importphase_suite.expectBundled(t, bundled{
+ files: map[string]string{
+ "/entry.js": `
+ import source foo0 from './foo.json'
+ import source foo1 from './foo.json' with { type: 'json' }
+
+ console.log(
+ foo0,
+ foo1,
+ import.source('./foo.json'),
+ import.source('./foo.json', { with: { type: 'json' } }),
+ import.source(` + "`./${foo}.json`" + `),
+ import.source(` + "`./${foo}.json`" + `, { with: { type: 'json' } }),
+ )
+ `,
+ "/foo.json": `{}`,
+ },
+ entryPaths: []string{"/entry.js"},
+ options: config.Options{
+ Mode: config.ModeBundle,
+ OutputFormat: config.FormatIIFE,
+ AbsOutputFile: "/out.js",
+ },
+ expectedScanLog: `entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+entry.js: ERROR: Bundling source phase imports with the "iife" output format is not supported
+`,
+ })
+}
diff --git internal/bundler_tests/snapshots/snapshots_dce.txt internal/bundler_tests/snapshots/snapshots_dce.txt
index ccba2e2b43f..cfa5b93a60f 100644
--- internal/bundler_tests/snapshots/snapshots_dce.txt
+++ internal/bundler_tests/snapshots/snapshots_dce.txt
@@ -135,7 +135,7 @@ const variable = !1;
================================================================================
TestConstValueInliningNoBundle
---------- /out/top-level.js ----------
-const n_keep = null, u_keep = void 0, i_keep = 1234567, f_keep = 123.456, s_keep = "";
+const n_keep = null, u_keep = void 0, i_keep = 1234567, f_keep = 123.456, s_keep = "abc";
console.log(
// These are doubled to avoid the "inline const/let into next statement if used once" optimization
null,
@@ -146,13 +146,13 @@ console.log(
1234567,
123.456,
123.456,
- s_keep,
- s_keep
+ "abc",
+ "abc"
);
---------- /out/nested-block.js ----------
{
- const s_keep = "";
+ const s_keep = "Long strings are not inlined as constants";
console.log(
// These are doubled to avoid the "inline const/let into next statement if used once" optimization
null,
@@ -163,6 +163,8 @@ console.log(
1234567,
123.456,
123.456,
+ "abc",
+ "abc",
s_keep,
s_keep
);
@@ -170,7 +172,7 @@ console.log(
---------- /out/nested-function.js ----------
function nested() {
- const s_keep = "";
+ const s_keep = "Long strings are not inlined as constants";
console.log(
// These are doubled to avoid the "inline const/let into next statement if used once" optimization
null,
@@ -181,6 +183,8 @@ function nested() {
1234567,
123.456,
123.456,
+ "abc",
+ "abc",
s_keep,
s_keep
);
@@ -262,6 +266,37 @@ function foo() {
return f();
}
+================================================================================
+TestCrossModuleConstantFoldingComputedPropertyName
+---------- /out/enum-entry.js ----------
+// enum-entry.ts
+console.log({
+ 123: 123 /* a */,
+ abc: "abc" /* b */
+});
+var Foo = class {
+ ["__proto__"] = {};
+ ["prototype"] = {};
+ ["constructor"]() {
+ }
+};
+
+---------- /out/const-entry.js ----------
+// const-constants.js
+var proto = "__proto__", ptype = "prototype", ctor = "constructor";
+
+// const-entry.js
+console.log({
+ 456: 456,
+ xyz: "xyz"
+});
+var Foo = class {
+ [proto] = {};
+ [ptype] = {};
+ [ctor]() {
+ }
+};
+
================================================================================
TestCrossModuleConstantFoldingNumber
---------- /out/enum-entry.js ----------
@@ -275,7 +310,7 @@ console.log([
], [
9,
-3,
- 3 /* a */ * 6 /* b */,
+ 18,
3 /* a */ / 6 /* b */,
3 /* a */ % 6 /* b */,
3 /* a */ ** 6 /* b */
@@ -315,7 +350,7 @@ console.log([
], [
9,
-3,
- 3 * 6,
+ 18,
3 / 6,
3 % 6,
3 ** 6
@@ -377,38 +412,32 @@ console.log([
]);
---------- /out/const-entry.js ----------
-// const-constants.js
-var a = "foo", b = "bar";
-
// const-entry.js
console.log([
- typeof b
+ typeof "bar"
], [
- a + b
+ "foobar"
], [
- a < b,
- a > b,
- a <= b,
- a >= b,
- a == b,
- a != b,
- a === b,
- a !== b
+ !1,
+ !0,
+ !1,
+ !0,
+ !1,
+ !0,
+ !1,
+ !0
], [
- a && b,
- a || b,
- a ?? b,
- a ? "y" : "n",
- b ? "n" : "y"
+ "bar",
+ "foo",
+ "foo",
+ "y",
+ "n"
]);
---------- /out/nested-entry.js ----------
-// nested-constants.ts
-var a = "foo", b = "bar", c = "baz";
-
// nested-entry.ts
console.log({
- "should be foobarbaz": a + b + c,
+ "should be foobarbaz": "foobarbaz",
"should be FOOBARBAZ": "FOOBARBAZ"
});
diff --git internal/bundler_tests/snapshots/snapshots_importphase.txt internal/bundler_tests/snapshots/snapshots_importphase.txt
new file mode 100644
index 00000000000..b44f3d96dd0
--- /dev/null
+++ internal/bundler_tests/snapshots/snapshots_importphase.txt
@@ -0,0 +1,52 @@
+TestImportDeferExternalESM
+---------- /out.js ----------
+// entry.js
+import defer * as foo0 from "./foo.json";
+import defer * as foo1 from "./foo.json" with { type: "json" };
+
+// import.defer("./**/*.json") in entry.js
+var globImport_json = __glob({
+ "./foo.json": () => import.defer("./foo.json")
+});
+
+// import.defer("./**/*.json") in entry.js
+var globImport_json2 = __glob({
+ "./foo.json": () => import.defer("./foo.json", { with: { type: "json" } })
+});
+
+// entry.js
+console.log(
+ foo0,
+ foo1,
+ import.defer("./foo.json"),
+ import.defer("./foo.json", { with: { type: "json" } }),
+ globImport_json(`./${foo}.json`),
+ globImport_json2(`./${foo}.json`)
+);
+
+================================================================================
+TestImportSourceExternalESM
+---------- /out.js ----------
+// entry.js
+import source foo0 from "./foo.json";
+import source foo1 from "./foo.json" with { type: "json" };
+
+// import.source("./**/*.json") in entry.js
+var globImport_json = __glob({
+ "./foo.json": () => import.source("./foo.json")
+});
+
+// import.source("./**/*.json") in entry.js
+var globImport_json2 = __glob({
+ "./foo.json": () => import.source("./foo.json", { with: { type: "json" } })
+});
+
+// entry.js
+console.log(
+ foo0,
+ foo1,
+ import.source("./foo.json"),
+ import.source("./foo.json", { with: { type: "json" } }),
+ globImport_json(`./${foo}.json`),
+ globImport_json2(`./${foo}.json`)
+);
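The snapshot above exercises the new deferred and source phase imports, which the diff threads through import records as a phase enum (`ast.EvaluationPhase`, `ast.DeferPhase`, `ast.SourcePhase`). A minimal standalone sketch of such an enum — the `String` method here is hypothetical, added only for illustration, and does not exist in esbuild's `ast` package:

```go
package main

import "fmt"

// ImportPhase sketches the three-valued enum added in this release:
// evaluation (the default), defer ("import defer * as ns from ..."),
// and source ("import source foo from ...").
type ImportPhase uint8

const (
	EvaluationPhase ImportPhase = iota
	DeferPhase
	SourcePhase
)

// String is a hypothetical helper for printing, not part of esbuild.
func (p ImportPhase) String() string {
	switch p {
	case DeferPhase:
		return "defer"
	case SourcePhase:
		return "source"
	default:
		return "evaluation"
	}
}

func main() {
	// fmt uses the Stringer interface, so this prints the phase names
	fmt.Println(EvaluationPhase, DeferPhase, SourcePhase)
}
```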
diff --git internal/compat/js_table.go internal/compat/js_table.go
index d47101f7cf5..8c2b6d9113b 100644
--- internal/compat/js_table.go
+++ internal/compat/js_table.go
@@ -92,7 +92,9 @@ const (
Hashbang
ImportAssertions
ImportAttributes
+ ImportDefer
ImportMeta
+ ImportSource
InlineScript
LogicalAssignment
NestedRestBinding
@@ -153,7 +155,9 @@ var StringToJSFeature = map[string]JSFeature{
"hashbang": Hashbang,
"import-assertions": ImportAssertions,
"import-attributes": ImportAttributes,
+ "import-defer": ImportDefer,
"import-meta": ImportMeta,
+ "import-source": ImportSource,
"inline-script": InlineScript,
"logical-assignment": LogicalAssignment,
"nested-rest-binding": NestedRestBinding,
@@ -597,6 +601,7 @@ var jsTable = map[JSFeature]map[Engine][]versionRange{
Opera: {{start: v{109, 0, 0}}},
Safari: {{start: v{17, 2, 0}}},
},
+ ImportDefer: {},
ImportMeta: {
Chrome: {{start: v{64, 0, 0}}},
Deno: {{start: v{1, 0, 0}}},
@@ -608,6 +613,7 @@ var jsTable = map[JSFeature]map[Engine][]versionRange{
Opera: {{start: v{51, 0, 0}}},
Safari: {{start: v{11, 1, 0}}},
},
+ ImportSource: {},
InlineScript: {},
LogicalAssignment: {
// Note: The latest version of "IE" failed 9 tests including: Logical Assignment: &&= basic support
diff --git internal/config/config.go internal/config/config.go
index ebcb170356c..5ea0390f96f 100644
--- internal/config/config.go
+++ internal/config/config.go
@@ -487,6 +487,11 @@ type Options struct {
AllowOverwrite bool
LegalComments LegalComments
+ LogPathStyle logger.PathStyle
+ CodePathStyle logger.PathStyle
+ MetafilePathStyle logger.PathStyle
+ SourcemapPathStyle logger.PathStyle
+
// If true, make sure to generate a single file that can be written to stdout
WriteToStdout bool
diff --git internal/js_ast/js_ast.go internal/js_ast/js_ast.go
index f8d3fe32f5b..768ff5b97d9 100644
--- internal/js_ast/js_ast.go
+++ internal/js_ast/js_ast.go
@@ -920,6 +920,7 @@ type EImportCall struct {
Expr Expr
OptionsOrNil Expr
CloseParenLoc logger.Loc
+ Phase ast.ImportPhase
}
type Stmt struct {
@@ -1627,10 +1628,12 @@ const (
ConstValueTrue
ConstValueFalse
ConstValueNumber
+ ConstValueString
)
type ConstValue struct {
- Number float64 // Use this for "ConstValueNumber"
+ Number float64 // Use this for "ConstValueNumber"
+ String []uint16 // Use this for "ConstValueString"
Kind ConstValueKind
}
@@ -1658,8 +1661,11 @@ func ExprToConstValue(expr Expr) ConstValue {
}
case *EString:
- // I'm deliberately not inlining strings here. It seems more likely that
- // people won't want them to be inlined since they can be arbitrarily long.
+ // Deliberately only inline small strings. We don't want to always
+ // inline all strings because they can be arbitrarily long.
+ if len(v.Value) <= 3 {
+ return ConstValue{Kind: ConstValueString, String: v.Value}
+ }
case *EBigInt:
// I'm deliberately not inlining bigints here for the same reason (they can
@@ -1685,6 +1691,9 @@ func ConstValueToExpr(loc logger.Loc, value ConstValue) Expr {
case ConstValueNumber:
return Expr{Loc: loc, Data: &ENumber{Value: value.Number}}
+
+ case ConstValueString:
+ return Expr{Loc: loc, Data: &EString{Value: value.String}}
}
panic("Internal error: invalid constant value")
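The `ConstValue` change above starts inlining short strings as cross-module constants. A minimal standalone sketch of just the new length rule (the `maybeInlineString` helper is hypothetical; only the `len(v.Value) <= 3` threshold comes from the diff):

```go
package main

import "fmt"

// maybeInlineString mirrors the new rule in ExprToConstValue: strings of
// three or fewer UTF-16 code units become inlinable constants, while
// anything longer stays behind its named binding (strings can be
// arbitrarily long, so always inlining them could bloat output).
func maybeInlineString(value []uint16) ([]uint16, bool) {
	if len(value) <= 3 {
		return value, true
	}
	return nil, false
}

func main() {
	_, okShort := maybeInlineString([]uint16{'a', 'b', 'c'})      // "abc"
	_, okLong := maybeInlineString([]uint16{'a', 'b', 'c', 'd'}) // "abcd"
	fmt.Println(okShort, okLong)
}
```

This matches the snapshot changes earlier in the diff, where `s_keep = "abc"` is replaced by the literal `"abc"` but `"Long strings are not inlined as constants"` keeps its binding.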
diff --git internal/js_ast/js_ast_helpers.go internal/js_ast/js_ast_helpers.go
index 69c34a41b3d..bfd8cf4dfb2 100644
--- internal/js_ast/js_ast_helpers.go
+++ internal/js_ast/js_ast_helpers.go
@@ -1185,6 +1185,15 @@ func ShouldFoldBinaryOperatorWhenMinifying(binary *EBinary) bool {
return true
}
+ case BinOpMul:
+ // Allow multiplication of small-ish integers to be folded
+ // "3 * 6" => "18"
+ if left, right, ok := extractNumericValues(binary.Left, binary.Right); ok &&
+ left == math.Trunc(left) && math.Abs(left) <= 0xFF &&
+ right == math.Trunc(right) && math.Abs(right) <= 0xFF {
+ return true
+ }
+
case BinOpDiv:
// "0/0" => "NaN"
// "1/0" => "Infinity"
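The new `BinOpMul` case only folds products whose operands are integers with magnitude at most 0xFF, so every folded product is exactly representable as a float64 and stays short in printed form. A hedged sketch of just that guard (`shouldFoldMul` is a hypothetical extraction for illustration, not esbuild's API):

```go
package main

import (
	"fmt"
	"math"
)

// shouldFoldMul loosely mirrors the new multiplication-folding guard:
// fold only when both operands are integers with |x| <= 0xFF, so the
// product (at most 255*255 = 65025) is exact and compact.
func shouldFoldMul(left, right float64) bool {
	return left == math.Trunc(left) && math.Abs(left) <= 0xFF &&
		right == math.Trunc(right) && math.Abs(right) <= 0xFF
}

func main() {
	fmt.Println(shouldFoldMul(3, 6))   // integers within range: folds to 18
	fmt.Println(shouldFoldMul(3.5, 6)) // non-integer operand: not folded
	fmt.Println(shouldFoldMul(256, 2)) // operand above 0xFF: not folded
}
```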
diff --git internal/js_lexer/js_lexer_test.go internal/js_lexer/js_lexer_test.go
index 47275ea85c6..fb0eea65fcc 100644
--- internal/js_lexer/js_lexer_test.go
+++ internal/js_lexer/js_lexer_test.go
@@ -32,7 +32,7 @@ func assertEqualStrings(t *testing.T, a string, b string) {
}
}
-func lexToken(t *testing.T, contents string) T {
+func lexToken(contents string) T {
log := logger.NewDeferLog(logger.DeferLogNoVerboseOrDebug, nil)
lexer := NewLexer(log, test.SourceForTest(contents), config.TSOptions{})
return lexer.Token
@@ -605,7 +605,7 @@ func TestTokens(t *testing.T) {
contents := it.contents
token := it.token
t.Run(contents, func(t *testing.T) {
- test.AssertEqual(t, lexToken(t, contents), token)
+ test.AssertEqual(t, lexToken(contents), token)
})
}
}
diff --git internal/js_parser/js_parser.go internal/js_parser/js_parser.go
index da11819d340..2c1652804ab 100644
--- internal/js_parser/js_parser.go
+++ internal/js_parser/js_parser.go
@@ -382,7 +382,6 @@ type parser struct {
shouldFoldTypeScriptConstantExpressions bool
allowIn bool
- allowPrivateIdentifiers bool
hasTopLevelReturn bool
latestReturnHadSemicolon bool
messageAboutThisIsUndefined bool
@@ -400,6 +399,7 @@ type globPatternImport struct {
approximateRange logger.Range
ref ast.Ref
kind ast.ImportKind
+ phase ast.ImportPhase
}
type namespaceImportItems struct {
@@ -477,6 +477,8 @@ type optionsThatSupportStructuralEquality struct {
mode config.Mode
platform config.Platform
outputFormat config.Format
+ logPathStyle logger.PathStyle
+ codePathStyle logger.PathStyle
asciiOnly bool
keepNames bool
minifySyntax bool
@@ -532,6 +534,8 @@ func OptionsFromConfig(options *config.Options) Options {
treeShaking: options.TreeShaking,
dropDebugger: options.DropDebugger,
mangleQuoted: options.MangleQuoted,
+ logPathStyle: options.LogPathStyle,
+ codePathStyle: options.CodePathStyle,
},
}
}
@@ -3113,8 +3117,9 @@ func (p *parser) parseFnExpr(loc logger.Loc, isAsync bool, asyncRange logger.Ran
}
type parenExprOpts struct {
- asyncRange logger.Range
- forceArrowFn bool
+ asyncRange logger.Range
+ forceArrowFn bool
+ isAfterQuestionAndBeforeColon bool
}
// This assumes that the open parenthesis has already been parsed by the caller
@@ -3203,7 +3208,8 @@ func (p *parser) parseParenExpr(loc logger.Loc, level js_ast.L, opts parenExprOp
p.fnOrArrowDataParse = oldFnOrArrowData
// Are these arguments to an arrow function?
- if p.lexer.Token == js_lexer.TEqualsGreaterThan || opts.forceArrowFn || (p.options.ts.Parse && p.lexer.Token == js_lexer.TColon) {
+ isArrowFn := p.lexer.Token == js_lexer.TEqualsGreaterThan
+ if isArrowFn || opts.forceArrowFn || (p.options.ts.Parse && p.lexer.Token == js_lexer.TColon) {
// Arrow functions are not allowed inside certain expressions
if level > js_ast.LAssign {
p.lexer.Unexpected()
@@ -3228,13 +3234,34 @@ func (p *parser) parseParenExpr(loc logger.Loc, level js_ast.L, opts parenExprOp
args = append(args, js_ast.Arg{Binding: binding, DefaultOrNil: initializerOrNil})
}
+ await := allowIdent
+ if isAsync {
+ await = allowExpr
+ }
+
// Avoid parsing TypeScript code like "a ? (1 + 2) : (3 + 4)" as an arrow
// function. The ":" after the ")" may be a return type annotation, so we
// attempt to convert the expressions to bindings first before deciding
// whether this is an arrow function, and only pick an arrow function if
// there were no conversion errors.
- if p.lexer.Token == js_lexer.TEqualsGreaterThan || (len(invalidLog.invalidTokens) == 0 &&
- p.trySkipTypeScriptArrowReturnTypeWithBacktracking()) || opts.forceArrowFn {
+ if p.options.ts.Parse && p.lexer.Token == js_lexer.TColon && len(invalidLog.invalidTokens) == 0 {
+ if opts.isAfterQuestionAndBeforeColon {
+ // Only do this very expensive check if we must
+ isArrowFn = p.isTypeScriptArrowReturnTypeAfterQuestionAndBeforeColon(await)
+ if isArrowFn {
+ // We know this will succeed because we've already done it once above
+ p.lexer.Next()
+ p.skipTypeScriptReturnType()
+ }
+ } else {
+ // Otherwise, do the less expensive check
+ isArrowFn = p.trySkipTypeScriptArrowReturnTypeWithBacktracking()
+ }
+ }
+
+ // Arrow function parsing may be forced if this parenthesized expression
+ // was prefixed by a TypeScript type parameter list such as "<T,>()"
+ if isArrowFn || opts.forceArrowFn {
if commaAfterSpread.Start != 0 {
p.log.AddError(&p.tracker, logger.Range{Loc: commaAfterSpread, Len: 1}, "Unexpected \",\" after rest pattern")
}
@@ -3255,11 +3282,6 @@ func (p *parser) parseParenExpr(loc logger.Loc, level js_ast.L, opts parenExprOp
p.markSyntaxFeature(entry.feature, entry.token)
}
- await := allowIdent
- if isAsync {
- await = allowExpr
- }
-
arrow := p.parseArrowBody(args, fnOrArrowDataParse{
needsAsyncLoc: loc,
await: await,
@@ -3443,6 +3465,7 @@ const (
exprFlagDecorator exprFlag = 1 << iota
exprFlagForLoopInit
exprFlagForAwaitLoopInit
+ exprFlagAfterQuestionAndBeforeColon
)
func (p *parser) parsePrefix(level js_ast.L, errors *deferredErrors, flags exprFlag) js_ast.Expr {
@@ -3489,7 +3512,9 @@ func (p *parser) parsePrefix(level js_ast.L, errors *deferredErrors, flags exprF
return value
}
- value := p.parseParenExpr(loc, level, parenExprOpts{})
+ value := p.parseParenExpr(loc, level, parenExprOpts{
+ isAfterQuestionAndBeforeColon: (flags & exprFlagAfterQuestionAndBeforeColon) != 0,
+ })
return value
case js_lexer.TFalse:
@@ -3512,7 +3537,7 @@ func (p *parser) parsePrefix(level js_ast.L, errors *deferredErrors, flags exprF
return js_ast.Expr{Loc: loc, Data: js_ast.EThisShared}
case js_lexer.TPrivateIdentifier:
- if !p.allowPrivateIdentifiers || !p.allowIn || level >= js_ast.LCompare {
+ if !p.allowIn || level >= js_ast.LCompare {
p.lexer.Unexpected()
}
@@ -4098,14 +4123,24 @@ func (p *parser) willNeedBindingPattern() bool {
// Note: The caller has already parsed the "import" keyword
func (p *parser) parseImportExpr(loc logger.Loc, level js_ast.L) js_ast.Expr {
// Parse an "import.meta" expression
+ phase := ast.EvaluationPhase
if p.lexer.Token == js_lexer.TDot {
p.lexer.Next()
- if !p.lexer.IsContextualKeyword("meta") {
- p.lexer.ExpectedString("\"meta\"")
+ if p.lexer.IsContextualKeyword("meta") {
+ p.esmImportMeta = logger.Range{Loc: loc, Len: p.lexer.Range().End() - loc.Start}
+ p.lexer.Next()
+ return js_ast.Expr{Loc: loc, Data: &js_ast.EImportMeta{RangeLen: p.esmImportMeta.Len}}
+ } else if p.lexer.IsContextualKeyword("defer") {
+ p.markSyntaxFeature(compat.ImportDefer, p.lexer.Range())
+ phase = ast.DeferPhase
+ p.lexer.Next()
+ } else if p.lexer.IsContextualKeyword("source") {
+ p.markSyntaxFeature(compat.ImportSource, p.lexer.Range())
+ phase = ast.SourcePhase
+ p.lexer.Next()
+ } else {
+ p.lexer.Unexpected()
}
- p.esmImportMeta = logger.Range{Loc: loc, Len: p.lexer.Range().End() - loc.Start}
- p.lexer.Next()
- return js_ast.Expr{Loc: loc, Data: &js_ast.EImportMeta{RangeLen: p.esmImportMeta.Len}}
}
if level > js_ast.LCall {
@@ -4145,6 +4180,7 @@ func (p *parser) parseImportExpr(loc logger.Loc, level js_ast.L) js_ast.Expr {
Expr: value,
OptionsOrNil: optionsOrNil,
CloseParenLoc: closeParenLoc,
+ Phase: phase,
}}
}
@@ -4226,7 +4262,7 @@ func (p *parser) parseSuffix(left js_ast.Expr, level js_ast.L, errors *deferredE
case js_lexer.TDot:
p.lexer.Next()
- if p.lexer.Token == js_lexer.TPrivateIdentifier && p.allowPrivateIdentifiers {
+ if p.lexer.Token == js_lexer.TPrivateIdentifier {
// "a.#b"
// "a?.b.#c"
if _, ok := left.Data.(*js_ast.ESuper); ok {
@@ -4336,7 +4372,7 @@ func (p *parser) parseSuffix(left js_ast.Expr, level js_ast.L, errors *deferredE
}}
default:
- if p.lexer.Token == js_lexer.TPrivateIdentifier && p.allowPrivateIdentifiers {
+ if p.lexer.Token == js_lexer.TPrivateIdentifier {
// "a?.#b"
name := p.lexer.Identifier
nameLoc := p.lexer.Loc()
@@ -4471,12 +4507,12 @@ func (p *parser) parseSuffix(left js_ast.Expr, level js_ast.L, errors *deferredE
oldAllowIn := p.allowIn
p.allowIn = true
- yes := p.parseExpr(js_ast.LComma)
+ yes := p.parseExprWithFlags(js_ast.LComma, exprFlagAfterQuestionAndBeforeColon)
p.allowIn = oldAllowIn
p.lexer.Expect(js_lexer.TColon)
- no := p.parseExpr(js_ast.LComma)
+ no := p.parseExprWithFlags(js_ast.LComma, flags&exprFlagAfterQuestionAndBeforeColon)
left = js_ast.Expr{Loc: left.Loc, Data: &js_ast.EIf{Test: left, Yes: yes, No: no}}
case js_lexer.TExclamation:
@@ -6413,9 +6449,7 @@ func (p *parser) parseClass(classKeyword logger.Range, name *ast.LocRef, classOp
// Allow "in" and private fields inside class bodies
oldAllowIn := p.allowIn
- oldAllowPrivateIdentifiers := p.allowPrivateIdentifiers
p.allowIn = true
- p.allowPrivateIdentifiers = true
// A scope is needed for private identifiers
scopeIndex := p.pushScopeForParsePass(js_ast.ScopeClassBody, bodyLoc)
@@ -6475,7 +6509,6 @@ func (p *parser) parseClass(classKeyword logger.Range, name *ast.LocRef, classOp
}
p.allowIn = oldAllowIn
- p.allowPrivateIdentifiers = oldAllowPrivateIdentifiers
closeBraceLoc := p.saveExprCommentsHere()
p.lexer.Expect(js_lexer.TCloseBrace)
@@ -7247,7 +7280,7 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
name := js_ast.GenerateNonUniqueNameFromPath(pathText) + "_star"
namespaceRef = p.storeNameInRef(js_lexer.MaybeSubstring{String: name})
}
- importRecordIndex := p.addImportRecord(ast.ImportStmt, pathRange, pathText, assertOrWith, flags)
+ importRecordIndex := p.addImportRecord(ast.ImportStmt, ast.EvaluationPhase, pathRange, pathText, assertOrWith, flags)
// Export-star statements anywhere in the file disable top-level const
// local prefix because import cycles can be used to trigger TDZ
@@ -7270,7 +7303,7 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
// "export {} from 'path'"
p.lexer.Next()
pathLoc, pathText, assertOrWith, flags := p.parsePath()
- importRecordIndex := p.addImportRecord(ast.ImportStmt, pathLoc, pathText, assertOrWith, flags)
+ importRecordIndex := p.addImportRecord(ast.ImportStmt, ast.EvaluationPhase, pathLoc, pathText, assertOrWith, flags)
name := "import_" + js_ast.GenerateNonUniqueNameFromPath(pathText)
namespaceRef := p.storeNameInRef(js_lexer.MaybeSubstring{String: name})
@@ -7818,6 +7851,7 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
p.esmImportStatementKeyword = p.lexer.Range()
p.lexer.Next()
stmt := js_ast.SImport{}
+ phase := ast.EvaluationPhase
wasOriginallyBareImport := false
// "export import foo = bar"
@@ -7881,9 +7915,54 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
}
defaultName := p.lexer.Identifier
- stmt.DefaultName = &ast.LocRef{Loc: p.lexer.Loc(), Ref: p.storeNameInRef(defaultName)}
+ defaultLoc := p.lexer.Loc()
+ isDeferName := p.lexer.Raw() == "defer"
+ isSourceName := p.lexer.Raw() == "source"
p.lexer.Next()
+ if isDeferName && p.lexer.Token == js_lexer.TAsterisk {
+ // "import defer * as foo from 'bar';"
+ p.markSyntaxFeature(compat.ImportDefer, js_lexer.RangeOfIdentifier(p.source, defaultLoc))
+ phase = ast.DeferPhase
+ p.lexer.Next()
+ p.lexer.ExpectContextualKeyword("as")
+ stmt.NamespaceRef = p.storeNameInRef(p.lexer.Identifier)
+ starLoc := p.lexer.Loc()
+ stmt.StarNameLoc = &starLoc
+ p.lexer.Expect(js_lexer.TIdentifier)
+ p.lexer.ExpectContextualKeyword("from")
+ break
+ }
+
+ if isSourceName && p.lexer.Token == js_lexer.TIdentifier {
+ if p.lexer.Raw() == "from" {
+ nameSubstring := p.lexer.Identifier
+ nameLoc := p.lexer.Loc()
+ p.lexer.Next()
+ if p.lexer.IsContextualKeyword("from") {
+ // "import source from from 'foo';"
+ p.markSyntaxFeature(compat.ImportSource, js_lexer.RangeOfIdentifier(p.source, defaultLoc))
+ phase = ast.SourcePhase
+ stmt.DefaultName = &ast.LocRef{Loc: nameLoc, Ref: p.storeNameInRef(nameSubstring)}
+ p.lexer.Next()
+ } else {
+ // "import source from 'foo';"
+ stmt.DefaultName = &ast.LocRef{Loc: defaultLoc, Ref: p.storeNameInRef(defaultName)}
+ }
+ break
+ }
+
+ // "import source foo from 'bar';"
+ p.markSyntaxFeature(compat.ImportSource, js_lexer.RangeOfIdentifier(p.source, defaultLoc))
+ phase = ast.SourcePhase
+ stmt.DefaultName = &ast.LocRef{Loc: p.lexer.Loc(), Ref: p.storeNameInRef(p.lexer.Identifier)}
+ p.lexer.Next()
+ p.lexer.ExpectContextualKeyword("from")
+ break
+ }
+
+ stmt.DefaultName = &ast.LocRef{Loc: defaultLoc, Ref: p.storeNameInRef(defaultName)}
+
if p.options.ts.Parse {
// Skip over type-only imports
if defaultName.String == "type" {
@@ -7997,7 +8076,7 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
if wasOriginallyBareImport {
flags |= ast.WasOriginallyBareImport
}
- stmt.ImportRecordIndex = p.addImportRecord(ast.ImportStmt, pathLoc, pathText, assertOrWith, flags)
+ stmt.ImportRecordIndex = p.addImportRecord(ast.ImportStmt, phase, pathLoc, pathText, assertOrWith, flags)
if stmt.StarNameLoc != nil {
name := p.loadNameFromRef(stmt.NamespaceRef)
@@ -8282,13 +8361,21 @@ func (p *parser) parseStmt(opts parseStmtOpts) js_ast.Stmt {
}
}
-func (p *parser) addImportRecord(kind ast.ImportKind, pathRange logger.Range, text string, assertOrWith *ast.ImportAssertOrWith, flags ast.ImportRecordFlags) uint32 {
+func (p *parser) addImportRecord(
+ kind ast.ImportKind,
+ phase ast.ImportPhase,
+ pathRange logger.Range,
+ text string,
+ assertOrWith *ast.ImportAssertOrWith,
+ flags ast.ImportRecordFlags,
+) uint32 {
index := uint32(len(p.importRecords))
p.importRecords = append(p.importRecords, ast.ImportRecord{
Kind: kind,
Range: pathRange,
Path: logger.Path{Text: text},
AssertOrWith: assertOrWith,
+ Phase: phase,
Flags: flags,
})
return index
@@ -8479,9 +8566,9 @@ func (p *parser) pushScopeForVisitPass(kind js_ast.ScopeKind, loc logger.Loc) {
// Sanity-check that the scopes generated by the first and second passes match
if order.loc != loc || order.scope.Kind != kind {
- panic(fmt.Sprintf("Expected scope (%d, %d) in %s, found scope (%d, %d)",
+ panic(fmt.Sprintf("Expected scope (%d, %d) in %q, found scope (%d, %d)",
kind, loc.Start,
- p.source.PrettyPath,
+ p.source.PrettyPaths.Select(p.options.logPathStyle),
order.scope.Kind, order.loc.Start))
}
@@ -12413,7 +12500,8 @@ func (p *parser) instantiateInjectDotName(loc logger.Loc, name injectedDotName,
p.log.AddErrorWithNotes(&p.tracker, r,
fmt.Sprintf("Cannot assign to %q because it's an import from an injected file", joined),
[]logger.MsgData{tracker.MsgData(js_lexer.RangeOfIdentifier(where.source, where.loc),
- fmt.Sprintf("The symbol %q was exported from %q here:", joined, where.source.PrettyPath))})
+ fmt.Sprintf("The symbol %q was exported from %q here:",
+ joined, where.source.PrettyPaths.Select(p.options.logPathStyle)))})
}
}
@@ -13363,7 +13451,8 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
p.log.AddErrorWithNotes(&p.tracker, r,
fmt.Sprintf("Cannot assign to %q because it's an import from an injected file", name),
[]logger.MsgData{tracker.MsgData(js_lexer.RangeOfIdentifier(where.source, where.loc),
- fmt.Sprintf("The symbol %q was exported from %q here:", name, where.source.PrettyPath))})
+ fmt.Sprintf("The symbol %q was exported from %q here:",
+ name, where.source.PrettyPaths.Select(p.options.logPathStyle)))})
}
}
}
@@ -13668,7 +13757,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
{
Kind: js_ast.PropertyField,
Key: js_ast.Expr{Loc: expr.Loc, Data: &js_ast.EString{Value: helpers.StringToUTF16("fileName")}},
- ValueOrNil: js_ast.Expr{Loc: expr.Loc, Data: &js_ast.EString{Value: helpers.StringToUTF16(p.source.PrettyPath)}},
+ ValueOrNil: js_ast.Expr{Loc: expr.Loc, Data: &js_ast.EString{Value: helpers.StringToUTF16(p.source.PrettyPaths.Select(p.options.codePathStyle))}},
},
{
Kind: js_ast.PropertyField,
@@ -14770,7 +14859,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
return js_ast.Expr{Loc: arg.Loc, Data: js_ast.ENullShared}
}
- importRecordIndex := p.addImportRecord(ast.ImportDynamic, p.source.RangeOfString(arg.Loc), helpers.UTF16ToString(str.Value), assertOrWith, flags)
+ importRecordIndex := p.addImportRecord(ast.ImportDynamic, e.Phase, p.source.RangeOfString(arg.Loc), helpers.UTF16ToString(str.Value), assertOrWith, flags)
if isAwaitTarget && p.fnOrArrowDataVisit.tryBodyCount != 0 {
record := &p.importRecords[importRecordIndex]
record.Flags |= ast.HandlesImportErrors
@@ -14789,7 +14878,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
// Handle glob patterns
if p.options.mode == config.ModeBundle {
- if value := p.handleGlobPattern(arg, ast.ImportDynamic, "globImport", assertOrWith); value.Data != nil {
+ if value := p.handleGlobPattern(arg, ast.ImportDynamic, e.Phase, "globImport", assertOrWith); value.Data != nil {
return value
}
}
@@ -15131,7 +15220,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
return js_ast.Expr{Loc: expr.Loc, Data: js_ast.ENullShared}
}
- importRecordIndex := p.addImportRecord(ast.ImportRequireResolve, p.source.RangeOfString(e.Args[0].Loc), helpers.UTF16ToString(str.Value), nil, 0)
+ importRecordIndex := p.addImportRecord(ast.ImportRequireResolve, ast.EvaluationPhase, p.source.RangeOfString(e.Args[0].Loc), helpers.UTF16ToString(str.Value), nil, 0)
if p.fnOrArrowDataVisit.tryBodyCount != 0 {
record := &p.importRecords[importRecordIndex]
record.Flags |= ast.HandlesImportErrors
@@ -15327,7 +15416,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
return js_ast.Expr{Loc: expr.Loc, Data: js_ast.ENullShared}
}
- importRecordIndex := p.addImportRecord(ast.ImportRequire, p.source.RangeOfString(arg.Loc), helpers.UTF16ToString(str.Value), nil, 0)
+ importRecordIndex := p.addImportRecord(ast.ImportRequire, ast.EvaluationPhase, p.source.RangeOfString(arg.Loc), helpers.UTF16ToString(str.Value), nil, 0)
if p.fnOrArrowDataVisit.tryBodyCount != 0 {
record := &p.importRecords[importRecordIndex]
record.Flags |= ast.HandlesImportErrors
@@ -15350,7 +15439,7 @@ func (p *parser) visitExprInOut(expr js_ast.Expr, in exprIn) (js_ast.Expr, exprO
// Handle glob patterns
if p.options.mode == config.ModeBundle {
- if value := p.handleGlobPattern(arg, ast.ImportRequire, "globRequire", nil); value.Data != nil {
+ if value := p.handleGlobPattern(arg, ast.ImportRequire, ast.EvaluationPhase, "globRequire", nil); value.Data != nil {
return value
}
}
@@ -16194,7 +16283,7 @@ func remapExprLocsInJSON(expr *js_ast.Expr, table []logger.StringInJSTableEntry)
}
}
-func (p *parser) handleGlobPattern(expr js_ast.Expr, kind ast.ImportKind, prefix string, assertOrWith *ast.ImportAssertOrWith) js_ast.Expr {
+func (p *parser) handleGlobPattern(expr js_ast.Expr, kind ast.ImportKind, phase ast.ImportPhase, prefix string, assertOrWith *ast.ImportAssertOrWith) js_ast.Expr {
pattern, approximateRange := p.globPatternFromExpr(expr)
if pattern == nil {
return js_ast.Expr{}
@@ -16242,8 +16331,8 @@ func (p *parser) handleGlobPattern(expr js_ast.Expr, kind ast.ImportKind, prefix
// Don't generate duplicate glob imports
outer:
for _, globPattern := range p.globPatternImports {
- // Check the kind
- if globPattern.kind != kind {
+ // Check the kind and phase
+ if globPattern.kind != kind || globPattern.phase != phase {
continue
}
@@ -16319,6 +16408,7 @@ outer:
approximateRange: approximateRange,
ref: ref,
kind: kind,
+ phase: phase,
})
}
@@ -18119,7 +18209,7 @@ func (p *parser) generateImportStmt(
p.moduleScope.Generated = append(p.moduleScope.Generated, namespaceRef)
declaredSymbols := make([]js_ast.DeclaredSymbol, 1+len(imports))
clauseItems := make([]js_ast.ClauseItem, len(imports))
- importRecordIndex := p.addImportRecord(ast.ImportStmt, pathRange, path, nil, 0)
+ importRecordIndex := p.addImportRecord(ast.ImportStmt, ast.EvaluationPhase, pathRange, path, nil, 0)
if sourceIndex != nil {
p.importRecords[importRecordIndex].SourceIndex = ast.MakeIndex32(*sourceIndex)
}
@@ -18207,6 +18297,7 @@ func (p *parser) toAST(before, parts, after []js_ast.Part, hashbang string, dire
before, importRecordIndex = p.generateImportStmt(helpers.GlobPatternToString(glob.parts), glob.approximateRange, []string{glob.name}, before, symbols, nil, nil)
record := &p.importRecords[importRecordIndex]
record.AssertOrWith = glob.assertOrWith
+ record.Phase = glob.phase
record.GlobPattern = &ast.GlobPattern{
Parts: glob.parts,
ExportAlias: glob.name,
diff --git internal/js_parser/js_parser_lower.go internal/js_parser/js_parser_lower.go
index fbd74416061..e39acbc1276 100644
--- internal/js_parser/js_parser_lower.go
+++ internal/js_parser/js_parser_lower.go
@@ -88,6 +88,16 @@ func (p *parser) markSyntaxFeature(feature compat.JSFeature, r logger.Range) (di
"Top-level await is not available in %s", where))
return
+ case compat.ImportDefer:
+ p.log.AddError(&p.tracker, r, fmt.Sprintf(
+ "Deferred imports are not available in %s", where))
+ return
+
+ case compat.ImportSource:
+ p.log.AddError(&p.tracker, r, fmt.Sprintf(
+ "Source phase imports are not available in %s", where))
+ return
+
case compat.Bigint:
// This can't be polyfilled
kind := logger.Warning
diff --git internal/js_parser/js_parser_test.go internal/js_parser/js_parser_test.go
index 3e7919aaf3d..34d27a22fb9 100644
--- internal/js_parser/js_parser_test.go
+++ internal/js_parser/js_parser_test.go
@@ -3113,6 +3113,30 @@ func TestImport(t *testing.T) {
// String import alias with "import * as"
expectParseError(t, "import * as '' from 'foo'", "<stdin>: ERROR: Expected identifier but found \"''\"\n")
+
+ // See: https://github.com/tc39/proposal-defer-import-eval
+ expectPrinted(t, "import defer from 'bar'", "import defer from \"bar\";\n")
+ expectPrinted(t, "import defer, { foo } from 'bar'", "import defer, { foo } from \"bar\";\n")
+ expectPrinted(t, "import defer * as foo from 'bar'", "import defer * as foo from \"bar\";\n")
+ expectPrinted(t, "import.defer('foo')", "import.defer(\"foo\");\n")
+ expectParseError(t, "import defer 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"'bar'\"\n")
+ expectParseError(t, "import defer foo from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"foo\"\n")
+ expectParseError(t, "import defer { foo } from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"{\"\n")
+ expectParseErrorTarget(t, 6, "import defer * as foo from 'bar'", "<stdin>: ERROR: Deferred imports are not available in the configured target environment\n")
+ expectParseErrorTarget(t, 6, "import.defer('foo')", "<stdin>: ERROR: Deferred imports are not available in the configured target environment\n")
+
+ // See: https://github.com/tc39/proposal-source-phase-imports
+ expectPrinted(t, "import source from 'bar'", "import source from \"bar\";\n")
+ expectPrinted(t, "import source, { foo } from 'bar'", "import source, { foo } from \"bar\";\n")
+ expectPrinted(t, "import source foo from 'bar'", "import source foo from \"bar\";\n")
+ expectPrinted(t, "import source from from 'bar'", "import source from from \"bar\";\n")
+ expectPrinted(t, "import source source from 'bar'", "import source source from \"bar\";\n")
+ expectPrinted(t, "import.source('foo')", "import.source(\"foo\");\n")
+ expectParseError(t, "import source 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"'bar'\"\n")
+ expectParseError(t, "import source * as foo from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"*\"\n")
+ expectParseError(t, "import source { foo } from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"{\"\n")
+ expectParseErrorTarget(t, 6, "import source foo from 'bar'", "<stdin>: ERROR: Source phase imports are not available in the configured target environment\n")
+ expectParseErrorTarget(t, 6, "import.source('foo')", "<stdin>: ERROR: Source phase imports are not available in the configured target environment\n")
}
func TestExport(t *testing.T) {
@@ -4823,7 +4847,7 @@ func TestMangleUnaryConstantFolding(t *testing.T) {
func TestMangleBinaryConstantFolding(t *testing.T) {
expectPrintedNormalAndMangle(t, "x = 3 + 6", "x = 3 + 6;\n", "x = 9;\n")
expectPrintedNormalAndMangle(t, "x = 3 - 6", "x = 3 - 6;\n", "x = -3;\n")
- expectPrintedNormalAndMangle(t, "x = 3 * 6", "x = 3 * 6;\n", "x = 3 * 6;\n")
+ expectPrintedNormalAndMangle(t, "x = 3 * 6", "x = 3 * 6;\n", "x = 18;\n")
expectPrintedNormalAndMangle(t, "x = 3 / 6", "x = 3 / 6;\n", "x = 3 / 6;\n")
expectPrintedNormalAndMangle(t, "x = 3 % 6", "x = 3 % 6;\n", "x = 3 % 6;\n")
expectPrintedNormalAndMangle(t, "x = 3 ** 6", "x = 3 ** 6;\n", "x = 3 ** 6;\n")
@@ -6045,10 +6069,10 @@ func TestPreserveOptionalChainParentheses(t *testing.T) {
}
func TestPrivateIdentifiers(t *testing.T) {
- expectParseError(t, "#foo", "<stdin>: ERROR: Unexpected \"#foo\"\n")
- expectParseError(t, "#foo in this", "<stdin>: ERROR: Unexpected \"#foo\"\n")
- expectParseError(t, "this.#foo", "<stdin>: ERROR: Expected identifier but found \"#foo\"\n")
- expectParseError(t, "this?.#foo", "<stdin>: ERROR: Expected identifier but found \"#foo\"\n")
+ expectParseError(t, "#foo", "<stdin>: ERROR: Expected \"in\" but found end of file\n")
+ expectParseError(t, "#foo in this", "<stdin>: ERROR: Private name \"#foo\" must be declared in an enclosing class\n")
+ expectParseError(t, "this.#foo", "<stdin>: ERROR: Private name \"#foo\" must be declared in an enclosing class\n")
+ expectParseError(t, "this?.#foo", "<stdin>: ERROR: Private name \"#foo\" must be declared in an enclosing class\n")
expectParseError(t, "({ #foo: 1 })", "<stdin>: ERROR: Expected identifier but found \"#foo\"\n")
expectParseError(t, "class Foo { x = { #foo: 1 } }", "<stdin>: ERROR: Expected identifier but found \"#foo\"\n")
expectParseError(t, "class Foo { x = #foo }", "<stdin>: ERROR: Expected \"in\" but found \"}\"\n")
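The parser tests above pin down how the contextual keywords `defer` and `source` disambiguate: `import defer from 'bar'` still binds a default import named `defer`, while `import defer * as foo` switches to the defer phase, and `import source from from 'bar'` is a source-phase import of a default named `from`. A toy token-based classifier (not esbuild's actual parser, just an illustration of the grammar the tests encode):

```go
package main

import (
	"fmt"
	"strings"
)

// classifyImport is a rough sketch of the contextual-keyword ambiguity:
// "defer" and "source" are only phase markers in specific token positions,
// and otherwise act as ordinary default-import names.
func classifyImport(stmt string) string {
	fields := strings.Fields(stmt)
	if len(fields) >= 3 && fields[1] == "defer" && fields[2] == "*" {
		return "defer-phase namespace import" // import defer * as foo from 'bar'
	}
	if len(fields) >= 4 && fields[1] == "source" && fields[2] != "from" {
		return "source-phase import" // import source foo from 'bar'
	}
	if len(fields) >= 4 && fields[1] == "source" && fields[2] == "from" && fields[3] == "from" {
		return "source-phase import" // import source from from 'bar'
	}
	return "evaluation-phase import" // e.g. import defer from 'bar'
}

func main() {
	fmt.Println(classifyImport("import defer * as foo from 'bar'"))
	fmt.Println(classifyImport("import defer from 'bar'"))
	fmt.Println(classifyImport("import source foo from 'bar'"))
	fmt.Println(classifyImport("import source from 'bar'"))
}
```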
diff --git internal/js_parser/ts_parser.go internal/js_parser/ts_parser.go
index 4b37be60189..36f5328f0d9 100644
--- internal/js_parser/ts_parser.go
+++ internal/js_parser/ts_parser.go
@@ -920,6 +920,84 @@ func (p *parser) trySkipTypeScriptArrowReturnTypeWithBacktracking() bool {
return true
}
+// This is a very specific function that determines whether a colon token is a
+// TypeScript arrow function return type in the case where the arrow function
+// is the middle expression of a JavaScript ternary operator (i.e. is between
+// the "?" and ":" tokens). It's separate from the other function above called
+// "trySkipTypeScriptArrowReturnTypeWithBacktracking" because it's much more
+// expensive, and likely not as robust.
+func (originalParser *parser) isTypeScriptArrowReturnTypeAfterQuestionAndBeforeColon(await awaitOrYield) bool {
+ // Implement "backtracking" by swallowing lexer errors on a temporary parser
+ defer func() {
+ r := recover()
+ if _, isLexerPanic := r.(js_lexer.LexerPanic); isLexerPanic {
+ return // Swallow this error
+ } else if r != nil {
+ panic(r)
+ }
+ }()
+
+ // THIS IS A GROSS HACK. Some context:
+ //
+ // JavaScript is designed to not require a backtracking parser. Generally a
+ // backtracking parser is not regarded as a good thing and you try to avoid
+ // having one if it's not necessary.
+ //
+ // However, TypeScript's parser does do backtracking in an (admittedly noble)
+ // effort to retrofit nice type syntax onto JavaScript. Up until this edge
+ // case was discovered, this backtracking was limited to type syntax so
+ // esbuild could deal with it by using a backtracking lexer without needing a
+ // backtracking parser.
+ //
+ // This edge case requires a backtracking parser. The TypeScript compiler's
+ // algorithm for parsing this is to try to parse the entire arrow function
+ // body and then reset all the way back to the colon for the arrow function
+ // return type if the token following the arrow function body is not another
+ // colon. For example:
+ //
+ // x = a ? (b) : c => d;
+ // y = a ? (b) : c => d : e;
+ //
+ // The first colon of "x" pairs with the "?" because the arrow function
+ // "(b) : c => d" is not followed by a colon. However, the first colon of "y"
+ // starts a return type because the arrow function "(b) : c => d" is followed
+ // by a colon. In other words, the first ":" before the arrow function body
+ // must pair with the "?" unless there is another ":" to pair with it after
+ // the function body.
+ //
+ // I'm not going to rewrite esbuild's parser to support backtracking for this
+ // one edge case. So instead, esbuild tries to parse the arrow function body
+ // using a rough copy of the parser and then always throws the result away.
+ // So arrow function bodies will always be parsed twice for this edge case.
+ //
+ // This is a hack instead of a good solution because the parser isn't designed
+ // for this, and doing this is not going to have good test coverage given that
+ // it's an edge case. We can't prevent parser code (either currently or in the
+ // future) from accidentally depending on some parser state that isn't cloned
+ // here. That could result in a parser panic when parsing a more complex
+ // version of this edge case.
+ p := newParser(logger.NewDeferLog(logger.DeferLogNoVerboseOrDebug, nil), originalParser.source, originalParser.lexer, &originalParser.options)
+
+ // Clone all state that the parser needs to parse this arrow function body
+ p.allowIn = originalParser.allowIn
+ p.lexer.IsLogDisabled = true
+ p.pushScopeForParsePass(js_ast.ScopeEntry, logger.Loc{Start: 0})
+ p.pushScopeForParsePass(js_ast.ScopeFunctionArgs, logger.Loc{Start: 1})
+
+ // Parse the return type
+ p.lexer.Expect(js_lexer.TColon)
+ p.skipTypeScriptReturnType()
+
+ // Parse the body and throw it out (with the side effect of maybe throwing an error)
+ _ = p.parseArrowBody([]js_ast.Arg{}, fnOrArrowDataParse{await: await})
+
+ // There must be a colon following the arrow function body to pair with the leading "?"
+ p.lexer.Expect(js_lexer.TColon)
+
+ // Parsing was successful if we get here
+ return true
+}
+
func (p *parser) trySkipTypeScriptArrowArgsWithBacktracking() bool {
oldLexer := p.lexer
p.lexer.IsLogDisabled = true
diff --git internal/js_parser/ts_parser_test.go internal/js_parser/ts_parser_test.go
index f2641dcf1e7..c58e646aedf 100644
--- internal/js_parser/ts_parser_test.go
+++ internal/js_parser/ts_parser_test.go
@@ -2375,6 +2375,32 @@ func TestTSArrow(t *testing.T) {
expectPrintedTS(t, "function f(async?) { g(async in x) }", "function f(async) {\n g(async in x);\n}\n")
expectPrintedTS(t, "function f(async?) { g(async as boolean) }", "function f(async) {\n g(async);\n}\n")
expectPrintedTS(t, "function f() { g(async as => boolean) }", "function f() {\n g(async (as) => boolean);\n}\n")
+
+ // https://github.com/evanw/esbuild/issues/4241
+ expectPrintedTS(t, "x = a ? (b = c) : d", "x = a ? b = c : d;\n")
+ expectPrintedTS(t, "x = a ? (b = c) : d => e", "x = a ? b = c : (d) => e;\n")
+ expectPrintedTS(t, "x = a ? (b = c) : T => d : (e = f)", "x = a ? (b = c) => d : e = f;\n")
+ expectPrintedTS(t, "x = a ? (b = c) : T => d : (e = f) : T => g", "x = a ? (b = c) => d : (e = f) => g;\n")
+ expectPrintedTS(t, "x = a ? b ? c : (d = e) : f => g", "x = a ? b ? c : d = e : (f) => g;\n")
+ expectPrintedTS(t, "x = a ? b ? (c = d) => e : (f = g) : h => i", "x = a ? b ? (c = d) => e : f = g : (h) => i;\n")
+ expectPrintedTS(t, "x = a ? b ? (c = d) : T => e : (f = g) : h => i", "x = a ? b ? (c = d) => e : f = g : (h) => i;\n")
+ expectPrintedTS(t, "x = a ? b ? (c = d) : T => e : (f = g) : (h = i) : T => j", "x = a ? b ? (c = d) => e : f = g : (h = i) => j;\n")
+ expectPrintedTS(t, "x = a ? (b) : T => c : d", "x = a ? (b) => c : d;\n")
+ expectPrintedTS(t, "x = a ? b - (c) : d => e", "x = a ? b - c : (d) => e;\n")
+ expectPrintedTS(t, "x = a ? b = (c) : T => d : e", "x = a ? b = (c) => d : e;\n")
+ expectParseErrorTS(t, "x = a ? (b = c) : T => d : (e = f) : g", "<stdin>: ERROR: Expected \";\" but found \":\"\n")
+ expectParseErrorTS(t, "x = a ? b ? (c = d) : T => e : (f = g)", "<stdin>: ERROR: Expected \":\" but found end of file\n")
+ expectParseErrorTS(t, "x = a ? - (b) : c => d : e", "<stdin>: ERROR: Expected \";\" but found \":\"\n")
+ expectParseErrorTS(t, "x = a ? b - (c) : d => e : f", "<stdin>: ERROR: Expected \";\" but found \":\"\n")
+
+ // Note: Newlines are important (they trigger backtracking)
+ expectPrintedTS(t, "x = (\n a ? (b = c) : { d: e }\n)", "x = a ? b = c : { d: e };\n")
+
+ // Need to clone "#private" identifier state in the parser
+ expectPrintedTS(t, "x = class { #y; y = a ? (b : T) : T => this.#y : c }", "x = class {\n #y;\n y = a ? (b) => this.#y : c;\n};\n")
+
+ // Need to clone "in" operator state in the parser
+ expectPrintedTS(t, "for (x = a ? () : T => b in c : d; ; ) ;", "for (x = a ? () => b in c : d; ; ) ;\n")
}
func TestTSSuperCall(t *testing.T) {
@@ -2940,6 +2966,22 @@ func TestTSTypeOnlyImport(t *testing.T) {
expectParseErrorTS(t, "import { x, type 'y' as 'z' } from 'mod'", "<stdin>: ERROR: Expected identifier but found \"'z'\"\n")
expectParseErrorTS(t, "import { x, type as 'y' } from 'mod'", "<stdin>: ERROR: Expected \"}\" but found \"'y'\"\n")
expectParseErrorTS(t, "import { x, type y as 'z' } from 'mod'", "<stdin>: ERROR: Expected identifier but found \"'z'\"\n")
+
+ // See: https://github.com/tc39/proposal-defer-import-eval
+ expectPrintedTS(t, "import defer * as foo from 'bar'", "")
+ expectPrintedTS(t, "import defer * as foo from 'bar'; let x: foo.Type", "let x;\n")
+ expectPrintedTS(t, "import defer * as foo from 'bar'; let x = foo.value", "import defer * as foo from \"bar\";\nlet x = foo.value;\n")
+ expectPrintedTS(t, "import type defer from 'bar'", "")
+ expectParseErrorTS(t, "import type defer * as foo from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"*\"\n")
+
+ // See: https://github.com/tc39/proposal-source-phase-imports
+ expectPrintedTS(t, "import type source from 'bar'", "")
+ expectPrintedTS(t, "import source foo from 'bar'", "")
+ expectPrintedTS(t, "import source foo from 'bar'; let x: foo", "let x;\n")
+ expectPrintedTS(t, "import source foo from 'bar'; let x = foo", "import source foo from \"bar\";\nlet x = foo;\n")
+ expectPrintedTS(t, "import source type from 'bar'", "")
+ expectParseErrorTS(t, "import source type foo from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"foo\"\n")
+ expectParseErrorTS(t, "import type source foo from 'bar'", "<stdin>: ERROR: Expected \"from\" but found \"foo\"\n")
}
func TestTSTypeOnlyExport(t *testing.T) {
diff --git internal/js_printer/js_printer.go internal/js_printer/js_printer.go
index b552fdc270b..2d780c581a9 100644
--- internal/js_printer/js_printer.go
+++ internal/js_printer/js_printer.go
@@ -1085,22 +1085,25 @@ func (p *printer) printProperty(property js_ast.Property) {
}
// Handle key syntax compression for cross-module constant inlining of enums
+ var keyFlags printExprFlags
if p.options.MinifySyntax && property.Flags.Has(js_ast.PropertyIsComputed) {
- if dot, ok := property.Key.Data.(*js_ast.EDot); ok {
- if value, ok := p.tryToGetImportedEnumValue(dot.Target, dot.Name); ok {
- if value.String != nil {
- property.Key.Data = &js_ast.EString{Value: value.String}
+ property.Key = p.lateConstantFoldUnaryOrBinaryOrIfExpr(property.Key)
+ keyFlags |= parentWasUnaryOrBinaryOrIfTest
- // Problematic key names must stay computed for correctness
- if !helpers.UTF16EqualsString(value.String, "__proto__") &&
- !helpers.UTF16EqualsString(value.String, "constructor") &&
- !helpers.UTF16EqualsString(value.String, "prototype") {
- property.Flags &= ^js_ast.PropertyIsComputed
- }
- } else {
- property.Key.Data = &js_ast.ENumber{Value: value.Number}
- property.Flags &= ^js_ast.PropertyIsComputed
- }
+ if key, ok := property.Key.Data.(*js_ast.EInlinedEnum); ok {
+ property.Key = key.Value
+ }
+
+ // Remove the computed flag if it's no longer needed
+ switch key := property.Key.Data.(type) {
+ case *js_ast.ENumber:
+ property.Flags &= ^js_ast.PropertyIsComputed
+
+ case *js_ast.EString:
+ if !helpers.UTF16EqualsString(key.Value, "__proto__") &&
+ !helpers.UTF16EqualsString(key.Value, "constructor") &&
+ !helpers.UTF16EqualsString(key.Value, "prototype") {
+ property.Flags &= ^js_ast.PropertyIsComputed
}
}
}
@@ -1167,7 +1170,7 @@ func (p *printer) printProperty(property js_ast.Property) {
p.options.Indent++
p.printIndent()
}
- p.printExpr(property.Key, js_ast.LComma, 0)
+ p.printExpr(property.Key, js_ast.LComma, keyFlags)
if isMultiLine {
p.printNewline()
p.printExprCommentsAfterCloseTokenAtLoc(property.CloseBracketLoc)
@@ -1311,7 +1314,7 @@ func (p *printer) printProperty(property js_ast.Property) {
}
default:
- p.printExpr(property.Key, js_ast.LLowest, 0)
+ p.printExpr(property.Key, js_ast.LLowest, keyFlags)
}
if fn, ok := property.ValueOrNil.Data.(*js_ast.EFunction); property.Kind.IsMethodDefinition() && ok {
@@ -1393,7 +1396,7 @@ func (p *printer) printQuotedUTF16(data []uint16, flags printQuotedFlags) {
p.print(c)
}
-func (p *printer) printRequireOrImportExpr(importRecordIndex uint32, level js_ast.L, flags printExprFlags, closeParenLoc logger.Loc) {
+func (p *printer) printRequireOrImportExpr(importRecordIndex uint32, level js_ast.L, flags printExprFlags, closeParenLoc logger.Loc, phase ast.ImportPhase) {
record := &p.importRecords[importRecordIndex]
if level >= js_ast.LNew || (flags&forbidCall) != 0 {
@@ -1457,7 +1460,14 @@ func (p *printer) printRequireOrImportExpr(importRecordIndex uint32, level js_as
kind := ast.ImportDynamic
if !p.options.UnsupportedFeatures.Has(compat.DynamicImport) {
p.printSpaceBeforeIdentifier()
- p.print("import(")
+ switch phase {
+ case ast.DeferPhase:
+ p.print("import.defer(")
+ case ast.SourcePhase:
+ p.print("import.source(")
+ default:
+ p.print("import(")
+ }
} else {
kind = ast.ImportRequire
p.printSpaceBeforeIdentifier()
@@ -2438,7 +2448,7 @@ func (p *printer) printExpr(expr js_ast.Expr, level js_ast.L, flags printExprFla
case *js_ast.ERequireString:
p.addSourceMapping(expr.Loc)
- p.printRequireOrImportExpr(e.ImportRecordIndex, level, flags, e.CloseParenLoc)
+ p.printRequireOrImportExpr(e.ImportRecordIndex, level, flags, e.CloseParenLoc, ast.EvaluationPhase)
case *js_ast.ERequireResolveString:
recordLoc := p.importRecords[e.ImportRecordIndex].Range.Loc
@@ -2473,7 +2483,7 @@ func (p *printer) printExpr(expr js_ast.Expr, level js_ast.L, flags printExprFla
case *js_ast.EImportString:
p.addSourceMapping(expr.Loc)
- p.printRequireOrImportExpr(e.ImportRecordIndex, level, flags, e.CloseParenLoc)
+ p.printRequireOrImportExpr(e.ImportRecordIndex, level, flags, e.CloseParenLoc, p.importRecords[e.ImportRecordIndex].Phase)
case *js_ast.EImportCall:
// Only print the second argument if either import assertions or import attributes are supported
@@ -2488,7 +2498,14 @@ func (p *printer) printExpr(expr js_ast.Expr, level js_ast.L, flags printExprFla
}
p.printSpaceBeforeIdentifier()
p.addSourceMapping(expr.Loc)
- p.print("import(")
+ switch e.Phase {
+ case ast.DeferPhase:
+ p.print("import.defer(")
+ case ast.SourcePhase:
+ p.print("import.source(")
+ default:
+ p.print("import(")
+ }
if isMultiLine {
p.printNewline()
p.options.Indent++
@@ -4688,7 +4705,14 @@ func (p *printer) printStmt(stmt js_ast.Stmt, flags printStmtFlags) {
p.addSourceMapping(stmt.Loc)
p.printIndent()
p.printSpaceBeforeIdentifier()
- p.print("import")
+ switch p.importRecords[s.ImportRecordIndex].Phase {
+ case ast.DeferPhase:
+ p.print("import defer")
+ case ast.SourcePhase:
+ p.print("import source")
+ default:
+ p.print("import")
+ }
p.printSpace()
if s.DefaultName != nil {
diff --git internal/js_printer/js_printer_test.go internal/js_printer/js_printer_test.go
index 13c20c88fd6..0358a6fd0c3 100644
--- internal/js_printer/js_printer_test.go
+++ internal/js_printer/js_printer_test.go
@@ -742,6 +742,12 @@ func TestImport(t *testing.T) {
expectPrinted(t, "import(/* webpackFoo: 1 */ 'path', { type: 'module' } /* webpackBar:2 */ );", "import(\n /* webpackFoo: 1 */\n \"path\",\n { type: \"module\" }\n /* webpackBar:2 */\n);\n")
expectPrinted(t, "import(new URL('path', /* webpackFoo: these can go anywhere */ import.meta.url))",
"import(new URL(\n \"path\",\n /* webpackFoo: these can go anywhere */\n import.meta.url\n));\n")
+
+ // See: https://github.com/tc39/proposal-defer-import-eval
+ expectPrintedMinify(t, "import defer * as foo from 'bar'", "import defer*as foo from\"bar\";")
+
+ // See: https://github.com/tc39/proposal-source-phase-imports
+ expectPrintedMinify(t, "import source foo from 'bar'", "import source foo from\"bar\";")
}
func TestExportDefault(t *testing.T) {
diff --git internal/linker/debug.go internal/linker/debug.go
index 04d0a394236..252733c3da6 100644
--- internal/linker/debug.go
+++ internal/linker/debug.go
@@ -81,7 +81,7 @@ func (c *linkerContext) generateExtraDataForFileJS(sourceIndex uint32) string {
} else {
sb.WriteByte(',')
}
- path := c.graph.Files[record.SourceIndex.GetIndex()].InputFile.Source.PrettyPath
+ path := c.graph.Files[record.SourceIndex.GetIndex()].InputFile.Source.PrettyPaths.Rel
sb.WriteString(fmt.Sprintf(`{"source":%s}`, helpers.QuoteForJSON(path, c.options.ASCIIOnly)))
}
sb.WriteByte(']')
@@ -122,7 +122,7 @@ func (c *linkerContext) generateExtraDataForFileJS(sourceIndex uint32) string {
sb.WriteByte(',')
}
sb.WriteString(fmt.Sprintf(`{"source":%s,"partIndex":%d}`,
- helpers.QuoteForJSON(c.graph.Files[dep.SourceIndex].InputFile.Source.PrettyPath, c.options.ASCIIOnly),
+ helpers.QuoteForJSON(c.graph.Files[dep.SourceIndex].InputFile.Source.PrettyPaths.Rel, c.options.ASCIIOnly),
dep.PartIndex,
))
}
diff --git internal/linker/linker.go internal/linker/linker.go
index 46070334572..206d1637041 100644
--- internal/linker/linker.go
+++ internal/linker/linker.go
@@ -757,7 +757,8 @@ func (c *linkerContext) generateChunksInParallel(additionalFiles []graph.OutputF
if c.options.NeedsMetafile {
jsonMetadataChunkPieces := c.breakJoinerIntoPieces(chunk.jsonMetadataChunkCallback(len(outputContents)))
jsonMetadataChunkBytes, _ := c.substituteFinalPaths(jsonMetadataChunkPieces, func(finalRelPathForImport string) string {
- return resolver.PrettyPath(c.fs, logger.Path{Text: c.fs.Join(c.options.AbsOutputDir, finalRelPathForImport), Namespace: "file"})
+ prettyPaths := resolver.MakePrettyPaths(c.fs, logger.Path{Text: c.fs.Join(c.options.AbsOutputDir, finalRelPathForImport), Namespace: "file"})
+ return prettyPaths.Select(c.options.MetafilePathStyle)
})
jsonMetadataChunk = string(jsonMetadataChunkBytes.Done())
}
@@ -1339,7 +1340,8 @@ func (c *linkerContext) scanImportsAndExports() {
if global, ok := otherRepr.AST.GlobalScope[name.Alias]; ok {
var hint string
if otherFile.InputFile.Loader == config.LoaderCSS {
- hint = fmt.Sprintf("Use the \"local-css\" loader for %q to enable local names.", otherFile.InputFile.Source.PrettyPath)
+ hint = fmt.Sprintf("Use the \"local-css\" loader for %q to enable local names.",
+ otherFile.InputFile.Source.PrettyPaths.Select(c.options.LogPathStyle))
} else {
hint = fmt.Sprintf("Use the \":local\" selector to change %q into a local name.", name.Alias)
}
@@ -1356,8 +1358,8 @@ func (c *linkerContext) scanImportsAndExports() {
} else {
c.log.AddError(file.LineColumnTracker(),
css_lexer.RangeOfIdentifier(file.InputFile.Source, name.AliasLoc),
- fmt.Sprintf("The name %q never appears in %q",
- name.Alias, otherFile.InputFile.Source.PrettyPath))
+ fmt.Sprintf("The name %q never appears in %q", name.Alias,
+ otherFile.InputFile.Source.PrettyPaths.Select(c.options.LogPathStyle)))
}
}
}
@@ -1605,12 +1607,15 @@ func (c *linkerContext) scanImportsAndExports() {
otherTracker := logger.MakeLineColumnTracker(&otherFile.Source)
ambiguousTracker := logger.MakeLineColumnTracker(&ambiguousFile.Source)
c.log.AddIDWithNotes(logger.MsgID_Bundler_AmbiguousReexport, logger.Debug, nil, logger.Range{},
- fmt.Sprintf("Re-export of %q in %q is ambiguous and has been removed", alias, file.Source.PrettyPath),
+ fmt.Sprintf("Re-export of %q in %q is ambiguous and has been removed", alias,
+ file.Source.PrettyPaths.Select(c.options.LogPathStyle)),
[]logger.MsgData{
otherTracker.MsgData(js_lexer.RangeOfIdentifier(otherFile.Source, mainLoc),
- fmt.Sprintf("One definition of %q comes from %q here:", alias, otherFile.Source.PrettyPath)),
+ fmt.Sprintf("One definition of %q comes from %q here:", alias,
+ otherFile.Source.PrettyPaths.Select(c.options.LogPathStyle))),
ambiguousTracker.MsgData(js_lexer.RangeOfIdentifier(ambiguousFile.Source, ambiguousLoc),
- fmt.Sprintf("Another definition of %q comes from %q here:", alias, ambiguousFile.Source.PrettyPath)),
+ fmt.Sprintf("Another definition of %q comes from %q here:", alias,
+ ambiguousFile.Source.PrettyPaths.Select(c.options.LogPathStyle))),
},
)
continue nextAlias
@@ -2685,8 +2690,8 @@ loop:
c.log.AddID(logger.MsgID_Bundler_ImportIsUndefined, kind,
trackerFile.LineColumnTracker(),
js_lexer.RangeOfIdentifier(trackerFile.InputFile.Source, namedImport.AliasLoc),
- fmt.Sprintf("Import %q will always be undefined because the file %q has no exports",
- namedImport.Alias, c.graph.Files[nextTracker.sourceIndex].InputFile.Source.PrettyPath))
+ fmt.Sprintf("Import %q will always be undefined because the file %q has no exports", namedImport.Alias,
+ c.graph.Files[nextTracker.sourceIndex].InputFile.Source.PrettyPaths.Select(c.options.LogPathStyle)))
}
case importDynamicFallback:
@@ -2746,7 +2751,7 @@ loop:
Kind: logger.Warning,
Data: trackerFile.LineColumnTracker().MsgData(r, fmt.Sprintf(
"Import %q will always be undefined because there is no matching export in %q",
- namedImport.Alias, nextFile.Source.PrettyPath)),
+ namedImport.Alias, nextFile.Source.PrettyPaths.Select(c.options.LogPathStyle))),
}
if helpers.IsInsideNodeModules(trackerFile.InputFile.Source.KeyPath.Text) {
msg.Kind = logger.Debug
@@ -2760,7 +2765,7 @@ loop:
Kind: logger.Error,
Data: trackerFile.LineColumnTracker().MsgData(r, fmt.Sprintf(
"No matching export in %q for import %q",
- nextFile.Source.PrettyPath, namedImport.Alias)),
+ nextFile.Source.PrettyPaths.Select(c.options.LogPathStyle), namedImport.Alias)),
}
c.maybeCorrectObviousTypo(nextFile.Repr.(*graph.JSRepr), namedImport.Alias, &msg)
c.log.AddMsg(msg)
@@ -4816,7 +4821,7 @@ func (c *linkerContext) generateCodeForFileInChunkJS(
}
cjsArgs = []js_ast.Expr{{Data: &js_ast.EObject{Properties: []js_ast.Property{{
Kind: kind,
- Key: js_ast.Expr{Data: &js_ast.EString{Value: helpers.StringToUTF16(file.InputFile.Source.PrettyPath)}},
+ Key: js_ast.Expr{Data: &js_ast.EString{Value: helpers.StringToUTF16(file.InputFile.Source.PrettyPaths.Select(c.options.CodePathStyle))}},
ValueOrNil: js_ast.Expr{Data: &js_ast.EFunction{Fn: js_ast.Fn{Args: args, Body: js_ast.FnBody{Block: js_ast.SBlock{Stmts: stmts}}}}},
}}}}}
} else if c.options.UnsupportedJSFeatures.Has(compat.Arrow) {
@@ -4888,7 +4893,7 @@ func (c *linkerContext) generateCodeForFileInChunkJS(
}
esmArgs = []js_ast.Expr{{Data: &js_ast.EObject{Properties: []js_ast.Property{{
Kind: kind,
- Key: js_ast.Expr{Data: &js_ast.EString{Value: helpers.StringToUTF16(file.InputFile.Source.PrettyPath)}},
+ Key: js_ast.Expr{Data: &js_ast.EString{Value: helpers.StringToUTF16(file.InputFile.Source.PrettyPaths.Select(c.options.CodePathStyle))}},
ValueOrNil: js_ast.Expr{Data: &js_ast.EFunction{Fn: js_ast.Fn{Body: js_ast.FnBody{Block: js_ast.SBlock{Stmts: stmts}}, IsAsync: isAsync}}},
}}}}}
} else if c.options.UnsupportedJSFeatures.Has(compat.Arrow) {
@@ -5797,7 +5802,7 @@ func (c *linkerContext) generateChunkJS(chunkIndex int, chunkWaitGroup *sync.Wai
}
jMeta.AddString("],\n")
if chunk.isEntryPoint {
- entryPoint := c.graph.Files[chunk.sourceIndex].InputFile.Source.PrettyPath
+ entryPoint := c.graph.Files[chunk.sourceIndex].InputFile.Source.PrettyPaths.Select(c.options.MetafilePathStyle)
jMeta.AddString(fmt.Sprintf(" \"entryPoint\": %s,\n", helpers.QuoteForJSON(entryPoint, c.options.ASCIIOnly)))
}
if chunkRepr.hasCSSChunk {
@@ -5832,7 +5837,7 @@ func (c *linkerContext) generateChunkJS(chunkIndex int, chunkWaitGroup *sync.Wai
j.AddString("\n")
}
- path := c.graph.Files[compileResult.sourceIndex].InputFile.Source.PrettyPath
+ path := c.graph.Files[compileResult.sourceIndex].InputFile.Source.PrettyPaths.Select(c.options.CodePathStyle)
// Make sure newlines in the path can't cause a syntax error. This does
// not minimize allocations because it's expected that this case never
@@ -5965,7 +5970,7 @@ func (c *linkerContext) generateChunkJS(chunkIndex int, chunkWaitGroup *sync.Wai
count += c.accurateFinalByteCount(output, finalRelDir)
}
jMeta.AddString(fmt.Sprintf("\n %s: {\n \"bytesInOutput\": %d\n %s}",
- helpers.QuoteForJSON(c.graph.Files[sourceIndex].InputFile.Source.PrettyPath, c.options.ASCIIOnly),
+ helpers.QuoteForJSON(c.graph.Files[sourceIndex].InputFile.Source.PrettyPaths.Select(c.options.MetafilePathStyle), c.options.ASCIIOnly),
count, c.generateExtraDataForFileJS(sourceIndex)))
}
if len(metaOrder) > 0 {
@@ -6327,7 +6332,7 @@ func (c *linkerContext) generateChunkCSS(chunkIndex int, chunkWaitGroup *sync.Wa
// and there is already an output file for the JavaScript entry point.
if _, ok := file.InputFile.Repr.(*graph.CSSRepr); ok {
jMeta.AddString(fmt.Sprintf("],\n \"entryPoint\": %s,\n \"inputs\": {",
- helpers.QuoteForJSON(file.InputFile.Source.PrettyPath, c.options.ASCIIOnly)))
+ helpers.QuoteForJSON(file.InputFile.Source.PrettyPaths.Select(c.options.MetafilePathStyle), c.options.ASCIIOnly)))
} else {
jMeta.AddString("],\n \"inputs\": {")
}
@@ -6352,7 +6357,8 @@ func (c *linkerContext) generateChunkCSS(chunkIndex int, chunkWaitGroup *sync.Wa
if newlineBeforeComment {
newline = "\n"
}
- comment := fmt.Sprintf("%s/* %s */\n", newline, c.graph.Files[compileResult.sourceIndex.GetIndex()].InputFile.Source.PrettyPath)
+ comment := fmt.Sprintf("%s/* %s */\n", newline,
+ c.graph.Files[compileResult.sourceIndex.GetIndex()].InputFile.Source.PrettyPaths.Select(c.options.CodePathStyle))
prevOffset.AdvanceString(comment)
j.AddString(comment)
}
@@ -6435,7 +6441,7 @@ func (c *linkerContext) generateChunkCSS(chunkIndex int, chunkWaitGroup *sync.Wa
jMeta.AddString(",")
}
jMeta.AddString(fmt.Sprintf("\n %s: {\n \"bytesInOutput\": %d\n }",
- helpers.QuoteForJSON(c.graph.Files[compileResult.sourceIndex.GetIndex()].InputFile.Source.PrettyPath, c.options.ASCIIOnly),
+ helpers.QuoteForJSON(c.graph.Files[compileResult.sourceIndex.GetIndex()].InputFile.Source.PrettyPaths.Select(c.options.MetafilePathStyle), c.options.ASCIIOnly),
c.accurateFinalByteCount(pieces[i], finalRelDir)))
}
if len(compileResults) > 0 {
@@ -6809,7 +6815,7 @@ func (c *linkerContext) generateIsolatedHash(chunk *chunkInfo, channel chan []by
if file.InputFile.Source.KeyPath.Namespace == "file" {
// Use the pretty path as the file name since it should be platform-
// independent (relative paths and the "/" path separator)
- filePath = file.InputFile.Source.PrettyPath
+ filePath = file.InputFile.Source.PrettyPaths.Rel
} else {
// If this isn't in the "file" namespace, just use the full path text
// verbatim. This could be a source of cross-platform differences if
@@ -7235,7 +7241,8 @@ func (c *linkerContext) recoverInternalError(waitGroup *sync.WaitGroup, sourceIn
if r := recover(); r != nil {
text := fmt.Sprintf("panic: %v", r)
if sourceIndex != runtime.SourceIndex {
- text = fmt.Sprintf("%s (while printing %q)", text, c.graph.Files[sourceIndex].InputFile.Source.PrettyPath)
+ text = fmt.Sprintf("%s (while printing %q)", text,
+ c.graph.Files[sourceIndex].InputFile.Source.PrettyPaths.Select(c.options.LogPathStyle))
}
c.log.AddErrorWithNotes(nil, logger.Range{}, text,
[]logger.MsgData{{Text: helpers.PrettyPrintedStack()}})
diff --git internal/logger/logger.go internal/logger/logger.go
index 8acb9048add..0c157a5bdae 100644
--- internal/logger/logger.go
+++ internal/logger/logger.go
@@ -163,7 +163,7 @@ type MsgData struct {
}
type MsgLocation struct {
- File string
+ File PrettyPaths
Namespace string
LineText string
Suggestion string
@@ -221,7 +221,7 @@ func (a SortableMsgs) Less(i int, j int) bool {
return aiLoc == nil && ajLoc != nil
}
if aiLoc.File != ajLoc.File {
- return aiLoc.File < ajLoc.File
+ return aiLoc.File.Abs < ajLoc.File.Abs || (aiLoc.File.Abs == ajLoc.File.Abs && aiLoc.File.Rel < ajLoc.File.Rel)
}
if aiLoc.Line != ajLoc.Line {
return aiLoc.Line < ajLoc.Line
@@ -403,15 +403,41 @@ func PlatformIndependentPathDirBaseExt(path string) (dir string, base string, ex
return
}
+type PrettyPaths struct {
+ // This option exists to help people that run esbuild in many different
+ // directories and want a unified way of reporting file paths. It avoids
+ // needing to write code to convert from relative paths back to absolute paths
+ // to find the original file. It means builds are not reproducible across
+ // machines, however.
+ Abs string
+
+ // This is a mostly platform-independent path. It's relative to the current
+ // working directory and always uses standard path separators. This is the
+ // default behavior since it leads to reproducible builds across machines.
+ //
+ // Note that these paths still u,se the original case of the path, so they may
+ // still work differently on file systems that are case-insensitive vs.
+ // case-sensitive.
+ Rel string
+}
+
+type PathStyle uint8
+
+const (
+ RelPath PathStyle = iota
+ AbsPath
+)
+
+func (paths *PrettyPaths) Select(style PathStyle) string {
+ if style == AbsPath {
+ return paths.Abs
+ }
+ return paths.Rel
+}
+
type Source struct {
// This is used for error messages and the metadata JSON file.
- //
- // This is a mostly platform-independent path. It's relative to the current
- // working directory and always uses standard path separators. Use this for
- // referencing a file in all output data. These paths still use the original
- // case of the path so they may still work differently on file systems that
- // are case-insensitive vs. case-sensitive.
- PrettyPath string
+ PrettyPaths PrettyPaths
// An identifier that is mixed in to automatically-generated symbol names to
// improve readability. For example, if the identifier is "util" then the
@@ -1176,12 +1202,13 @@ type OutputOptions struct {
IncludeSource bool
Color UseColor
LogLevel LogLevel
+ PathStyle PathStyle
Overrides map[MsgID]LogLevel
}
func (msg Msg) String(options OutputOptions, terminalInfo TerminalInfo) string {
// Format the message
- text := msgString(options.IncludeSource, terminalInfo, msg.ID, msg.Kind, msg.Data, msg.PluginName)
+ text := msgString(options.IncludeSource, options.PathStyle, terminalInfo, msg.ID, msg.Kind, msg.Data, msg.PluginName)
// Format the notes
var oldData MsgData
@@ -1189,7 +1216,7 @@ func (msg Msg) String(options OutputOptions, terminalInfo TerminalInfo) string {
if options.IncludeSource && (i == 0 || strings.IndexByte(oldData.Text, '\n') >= 0 || oldData.Location != nil) {
text += "\n"
}
- text += msgString(options.IncludeSource, terminalInfo, MsgID_None, Note, note, "")
+ text += msgString(options.IncludeSource, options.PathStyle, terminalInfo, MsgID_None, Note, note, "")
oldData = note
}
@@ -1216,10 +1243,10 @@ func emptyMarginText(maxMargin int, isLast bool) string {
return fmt.Sprintf(" %s │ ", space)
}
-func msgString(includeSource bool, terminalInfo TerminalInfo, id MsgID, kind MsgKind, data MsgData, pluginName string) string {
+func msgString(includeSource bool, pathStyle PathStyle, terminalInfo TerminalInfo, id MsgID, kind MsgKind, data MsgData, pluginName string) string {
if !includeSource {
if loc := data.Location; loc != nil {
- return fmt.Sprintf("%s: %s: %s\n", loc.File, kind.String(), data.Text)
+ return fmt.Sprintf("%s: %s: %s\n", loc.File.Select(pathStyle), kind.String(), data.Text)
}
return fmt.Sprintf("%s: %s\n", kind.String(), data.Text)
}
@@ -1237,7 +1264,7 @@ func msgString(includeSource bool, terminalInfo TerminalInfo, id MsgID, kind Msg
if data.Location != nil {
maxMargin := len(fmt.Sprintf("%d", data.Location.Line))
- d := detailStruct(data, terminalInfo, maxMargin)
+ d := detailStruct(data, pathStyle, terminalInfo, maxMargin)
if d.Suggestion != "" {
location = fmt.Sprintf("\n %s:%d:%d:\n%s%s%s%s%s%s\n%s%s%s%s%s\n%s%s%s%s%s\n%s",
@@ -1462,7 +1489,7 @@ type MsgDetail struct {
// the most important.
type LineColumnTracker struct {
contents string
- prettyPath string
+ prettyPaths PrettyPaths
offset int32
line int32
lineStart int32
@@ -1481,7 +1508,7 @@ func MakeLineColumnTracker(source *Source) LineColumnTracker {
return LineColumnTracker{
contents: source.Contents,
- prettyPath: source.PrettyPath,
+ prettyPaths: source.PrettyPaths,
hasLineStart: true,
hasSource: true,
}
@@ -1603,7 +1630,7 @@ func (tracker *LineColumnTracker) MsgLocationOrNil(r Range) *MsgLocation {
lineCount, columnCount, lineStart, lineEnd := tracker.computeLineAndColumn(int(r.Loc.Start))
return &MsgLocation{
- File: tracker.prettyPath,
+ File: tracker.prettyPaths,
Line: lineCount + 1, // 0-based to 1-based
Column: columnCount,
Length: int(r.Len),
@@ -1611,7 +1638,7 @@ func (tracker *LineColumnTracker) MsgLocationOrNil(r Range) *MsgLocation {
}
}
-func detailStruct(data MsgData, terminalInfo TerminalInfo, maxMargin int) MsgDetail {
+func detailStruct(data MsgData, pathStyle PathStyle, terminalInfo TerminalInfo, maxMargin int) MsgDetail {
// Only highlight the first line of the line text
loc := *data.Location
endOfFirstLine := len(loc.LineText)
@@ -1741,7 +1768,7 @@ func detailStruct(data MsgData, terminalInfo TerminalInfo, maxMargin int) MsgDet
margin := marginWithLineText(maxMargin, loc.Line)
return MsgDetail{
- Path: loc.File,
+ Path: loc.File.Select(pathStyle),
Line: loc.Line,
Column: loc.Column,
diff --git internal/resolver/package_json.go internal/resolver/package_json.go
index 068acdea0ef..01ad7d5fb83 100644
--- internal/resolver/package_json.go
+++ internal/resolver/package_json.go
@@ -255,9 +255,9 @@ func (r resolverQuery) parsePackageJSON(inputPath string) *packageJSON {
r.debugLogs.addNote(fmt.Sprintf("Failed to read file %q: %s", packageJSONPath, originalError.Error()))
}
if err != nil {
- r.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: packageJSONPath, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: packageJSONPath, Namespace: "file"})
+ r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read file %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
return nil
}
if r.debugLogs != nil {
@@ -266,9 +266,9 @@ func (r resolverQuery) parsePackageJSON(inputPath string) *packageJSON {
keyPath := logger.Path{Text: packageJSONPath, Namespace: "file"}
jsonSource := logger.Source{
- KeyPath: keyPath,
- PrettyPath: PrettyPath(r.fs, keyPath),
- Contents: contents,
+ KeyPath: keyPath,
+ PrettyPaths: MakePrettyPaths(r.fs, keyPath),
+ Contents: contents,
}
tracker := logger.MakeLineColumnTracker(&jsonSource)
diff --git internal/resolver/resolver.go internal/resolver/resolver.go
index bd34c50c37e..8f13043e938 100644
--- internal/resolver/resolver.go
+++ internal/resolver/resolver.go
@@ -334,19 +334,21 @@ func NewResolver(call config.APICall, fs fs.FS, log logger.Log, caches *cache.Ca
res.tsConfigOverride, err = r.parseTSConfig(options.TSConfigPath, visited, fs.Dir(options.TSConfigPath))
} else {
source := logger.Source{
- KeyPath: logger.Path{Text: fs.Join(fs.Cwd(), "<tsconfig.json>"), Namespace: "file"},
- PrettyPath: "<tsconfig.json>",
- Contents: options.TSConfigRaw,
+ KeyPath: logger.Path{Text: fs.Join(fs.Cwd(), "<tsconfig.json>"), Namespace: "file"},
+ PrettyPaths: logger.PrettyPaths{Abs: "<tsconfig.json>", Rel: "<tsconfig.json>"},
+ Contents: options.TSConfigRaw,
}
res.tsConfigOverride, err = r.parseTSConfigFromSource(source, visited, fs.Cwd())
}
if err != nil {
if err == syscall.ENOENT {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: options.TSConfigPath, Namespace: "file"})
r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot find tsconfig file %q",
- PrettyPath(r.fs, logger.Path{Text: options.TSConfigPath, Namespace: "file"})))
+ prettyPaths.Select(options.LogPathStyle)))
} else if err != errParseErrorAlreadyLogged {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: options.TSConfigPath, Namespace: "file"})
r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: options.TSConfigPath, Namespace: "file"}), err.Error()))
+ prettyPaths.Select(options.LogPathStyle), err.Error()))
}
} else {
r.flushDebugLogs(flushDueToSuccess)
@@ -1088,10 +1090,13 @@ func (r resolverQuery) resolveWithoutRemapping(sourceDirInfo *dirInfo, importPat
}
}
-func PrettyPath(fs fs.FS, path logger.Path) string {
+func MakePrettyPaths(fs fs.FS, path logger.Path) logger.PrettyPaths {
+ absPath := path.Text
+ relPath := path.Text
+
if path.Namespace == "file" {
- if rel, ok := fs.Rel(fs.Cwd(), path.Text); ok {
- path.Text = rel
+ if rel, ok := fs.Rel(fs.Cwd(), relPath); ok {
+ relPath = rel
}
// These human-readable paths are used in error messages, comments in output
@@ -1099,16 +1104,21 @@ func PrettyPath(fs fs.FS, path logger.Path) string {
// These should be platform-independent so our output doesn't depend on which
// operating system it was run. Replace Windows backward slashes with standard
// forward slashes.
- path.Text = strings.ReplaceAll(path.Text, "\\", "/")
+ relPath = strings.ReplaceAll(relPath, "\\", "/")
} else if path.Namespace != "" {
- path.Text = fmt.Sprintf("%s:%s", path.Namespace, path.Text)
+ absPath = fmt.Sprintf("%s:%s", path.Namespace, absPath)
+ relPath = fmt.Sprintf("%s:%s", path.Namespace, relPath)
}
if path.IsDisabled() {
- path.Text = "(disabled):" + path.Text
+ absPath = "(disabled):" + absPath
+ relPath = "(disabled):" + relPath
}
- return path.Text + path.IgnoredSuffix
+ return logger.PrettyPaths{
+ Abs: absPath + path.IgnoredSuffix,
+ Rel: relPath + path.IgnoredSuffix,
+ }
}
////////////////////////////////////////////////////////////////////////////////
@@ -1219,9 +1229,9 @@ func (r resolverQuery) parseTSConfig(file string, visited map[string]bool, confi
keyPath := logger.Path{Text: file, Namespace: "file"}
source := logger.Source{
- KeyPath: keyPath,
- PrettyPath: PrettyPath(r.fs, keyPath),
- Contents: contents,
+ KeyPath: keyPath,
+ PrettyPaths: MakePrettyPaths(r.fs, keyPath),
+ Contents: contents,
}
if visited != nil {
// This is only non-nil for "build" API calls. This is nil for "transform"
@@ -1271,9 +1281,9 @@ func (r resolverQuery) parseTSConfigFromSource(source logger.Source, visited map
r.log.AddID(logger.MsgID_TSConfigJSON_Cycle, logger.Warning, &tracker, extendsRange,
fmt.Sprintf("Base config file %q forms cycle", extends))
} else if err != errParseErrorAlreadyLogged {
- r.log.AddError(&tracker, extendsRange,
- fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: extendsFile, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: extendsFile, Namespace: "file"})
+ r.log.AddError(&tracker, extendsRange, fmt.Sprintf("Cannot read file %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
return nil, true
}
@@ -1555,9 +1565,9 @@ func (r resolverQuery) dirInfoUncached(path string) *dirInfo {
// list which contains such paths and treating them as missing means we just
// ignore them during path resolution.
if err != syscall.ENOENT && err != syscall.ENOTDIR {
- r.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot read directory %q: %s",
- PrettyPath(r.fs, logger.Path{Text: path, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: path, Namespace: "file"})
+ r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read directory %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
return nil
}
@@ -1645,12 +1655,13 @@ func (r resolverQuery) dirInfoUncached(path string) *dirInfo {
info.enclosingTSConfigJSON, err = r.parseTSConfig(tsConfigPath, make(map[string]bool), r.fs.Dir(tsConfigPath))
if err != nil {
if err == syscall.ENOENT {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: tsConfigPath, Namespace: "file"})
r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot find tsconfig file %q",
- PrettyPath(r.fs, logger.Path{Text: tsConfigPath, Namespace: "file"})))
+ prettyPaths.Select(r.options.LogPathStyle)))
} else if err != errParseErrorAlreadyLogged {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: tsConfigPath, Namespace: "file"})
r.log.AddID(logger.MsgID_TSConfigJSON_Missing, logger.Debug, nil, logger.Range{},
- fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: tsConfigPath, Namespace: "file"}), err.Error()))
+ fmt.Sprintf("Cannot read file %q: %s", prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
}
}
@@ -1733,9 +1744,9 @@ func (r resolverQuery) loadAsFile(path string, extensionOrder []string) (string,
}
if err != nil {
if err != syscall.ENOENT {
- r.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot read directory %q: %s",
- PrettyPath(r.fs, logger.Path{Text: dirPath, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: dirPath, Namespace: "file"})
+ r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read directory %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
return "", false, nil
}
@@ -2770,10 +2781,11 @@ func (r resolverQuery) finalizeImportsExportsResult(
fmt.Sprintf("The file %q is exported at path %q:", query, subpath)))
// Provide an inline suggestion message with the correct import path
+ prettyPaths := MakePrettyPaths(r.fs, absolute.Primary)
actualImportPath := path.Join(esmPackageName, subpath)
r.debugMeta.suggestionText = string(helpers.QuoteForJSON(actualImportPath, false))
r.debugMeta.suggestionMessage = fmt.Sprintf("Import from %q to get the file %q:",
- actualImportPath, PrettyPath(r.fs, absolute.Primary))
+ actualImportPath, prettyPaths.Select(r.options.LogPathStyle))
}
}
}
@@ -2788,11 +2800,12 @@ func (r resolverQuery) finalizeImportsExportsResult(
// Provide an inline suggestion message with the correct import path
if status == pjStatusModuleNotFoundMissingExtension {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: r.fs.Join(absDirPath, resolvedPath+missingSuffix), Namespace: "file"})
actualImportPath := path.Join(esmPackageName, esmPackageSubpath+missingSuffix)
r.debugMeta.suggestionRange = suggestionRangeEnd
r.debugMeta.suggestionText = missingSuffix
r.debugMeta.suggestionMessage = fmt.Sprintf("Import from %q to get the file %q:",
- actualImportPath, PrettyPath(r.fs, logger.Path{Text: r.fs.Join(absDirPath, resolvedPath+missingSuffix), Namespace: "file"}))
+ actualImportPath, prettyPaths.Select(r.options.LogPathStyle))
}
case pjStatusUnsupportedDirectoryImport, pjStatusUnsupportedDirectoryImportMissingIndex:
@@ -2804,11 +2817,12 @@ func (r resolverQuery) finalizeImportsExportsResult(
// Provide an inline suggestion message with the correct import path
if status == pjStatusUnsupportedDirectoryImportMissingIndex {
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: r.fs.Join(absDirPath, resolvedPath+missingSuffix), Namespace: "file"})
actualImportPath := path.Join(esmPackageName, esmPackageSubpath+missingSuffix)
r.debugMeta.suggestionRange = suggestionRangeEnd
r.debugMeta.suggestionText = missingSuffix
r.debugMeta.suggestionMessage = fmt.Sprintf("Import from %q to get the file %q:",
- actualImportPath, PrettyPath(r.fs, logger.Path{Text: r.fs.Join(absDirPath, resolvedPath+missingSuffix), Namespace: "file"}))
+ actualImportPath, prettyPaths.Select(r.options.LogPathStyle))
}
case pjStatusUndefinedNoConditionsMatch:
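The resolver changes above replace the old single `PrettyPath` string with a `PrettyPaths` pair carrying both an absolute and a relative form, which log sites then pick from via `Select`. The following is a simplified standalone sketch of that pattern — the types are redeclared here for illustration, and `makePrettyPaths` takes a hypothetical `cwd` string instead of the real `fs.FS` interface (and handles only the `"file"` namespace case, no ignored suffixes):

```go
package main

import (
	"fmt"
	"strings"
)

// PathStyle selects which flavor of a pretty path to print.
type PathStyle uint8

const (
	RelPath PathStyle = iota
	AbsPath
)

// PrettyPaths sketches the pair introduced in this PR: every source keeps
// both an absolute and a relative pretty path, and callers pick one.
type PrettyPaths struct {
	Abs string
	Rel string
}

// Select returns the path matching the requested style.
func (p PrettyPaths) Select(style PathStyle) string {
	if style == AbsPath {
		return p.Abs
	}
	return p.Rel
}

// makePrettyPaths mirrors the shape of resolver.MakePrettyPaths: keep the
// input as the absolute form, derive the relative form from cwd, and
// normalize Windows separators so output is platform-independent.
func makePrettyPaths(cwd, text string) PrettyPaths {
	rel := strings.TrimPrefix(text, cwd+"/")
	rel = strings.ReplaceAll(rel, "\\", "/")
	return PrettyPaths{Abs: text, Rel: rel}
}

func main() {
	paths := makePrettyPaths("/home/user/project", "/home/user/project/src/app.ts")
	fmt.Println(paths.Select(RelPath)) // src/app.ts
	fmt.Println(paths.Select(AbsPath)) // /home/user/project/src/app.ts
}
```

This explains why each call site in the diff changes from `PrettyPath(r.fs, path)` to `MakePrettyPaths(r.fs, path)` followed by `.Select(r.options.LogPathStyle)`: the style decision moves from path-construction time to print time.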
diff --git internal/resolver/yarnpnp.go internal/resolver/yarnpnp.go
index 5a6a68dc875..bb8cd81b809 100644
--- internal/resolver/yarnpnp.go
+++ internal/resolver/yarnpnp.go
@@ -614,9 +614,9 @@ func (r resolverQuery) extractYarnPnPDataFromJSON(pnpDataPath string, mode pnpDa
}
if err != nil {
if mode == pnpReportErrorsAboutMissingFiles || err != syscall.ENOENT {
- r.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: pnpDataPath, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: pnpDataPath, Namespace: "file"})
+ r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read file %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
return
}
@@ -625,9 +625,9 @@ func (r resolverQuery) extractYarnPnPDataFromJSON(pnpDataPath string, mode pnpDa
}
keyPath := logger.Path{Text: pnpDataPath, Namespace: "file"}
source = logger.Source{
- KeyPath: keyPath,
- PrettyPath: PrettyPath(r.fs, keyPath),
- Contents: contents,
+ KeyPath: keyPath,
+ PrettyPaths: MakePrettyPaths(r.fs, keyPath),
+ Contents: contents,
}
result, _ = r.caches.JSONCache.Parse(r.log, source, js_parser.JSONOptions{})
return
@@ -640,9 +640,9 @@ func (r resolverQuery) tryToExtractYarnPnPDataFromJS(pnpDataPath string, mode pn
}
if err != nil {
if mode == pnpReportErrorsAboutMissingFiles || err != syscall.ENOENT {
- r.log.AddError(nil, logger.Range{},
- fmt.Sprintf("Cannot read file %q: %s",
- PrettyPath(r.fs, logger.Path{Text: pnpDataPath, Namespace: "file"}), err.Error()))
+ prettyPaths := MakePrettyPaths(r.fs, logger.Path{Text: pnpDataPath, Namespace: "file"})
+ r.log.AddError(nil, logger.Range{}, fmt.Sprintf("Cannot read file %q: %s",
+ prettyPaths.Select(r.options.LogPathStyle), err.Error()))
}
return
}
@@ -652,9 +652,9 @@ func (r resolverQuery) tryToExtractYarnPnPDataFromJS(pnpDataPath string, mode pn
keyPath := logger.Path{Text: pnpDataPath, Namespace: "file"}
source = logger.Source{
- KeyPath: keyPath,
- PrettyPath: PrettyPath(r.fs, keyPath),
- Contents: contents,
+ KeyPath: keyPath,
+ PrettyPaths: MakePrettyPaths(r.fs, keyPath),
+ Contents: contents,
}
ast, _ := r.caches.JSCache.Parse(r.log, source, js_parser.OptionsForYarnPnP())
diff --git internal/resolver/yarnpnp_test.go internal/resolver/yarnpnp_test.go
index bee1f7361ec..3df4bf699c8 100644
--- internal/resolver/yarnpnp_test.go
+++ internal/resolver/yarnpnp_test.go
@@ -47,9 +47,9 @@ func TestYarnPnP(t *testing.T) {
}
source := logger.Source{
- KeyPath: logger.Path{Text: path},
- PrettyPath: path,
- Contents: string(contents),
+ KeyPath: logger.Path{Text: path},
+ PrettyPaths: logger.PrettyPaths{Abs: path, Rel: path},
+ Contents: string(contents),
}
tempLog := logger.NewDeferLog(logger.DeferLogAll, nil)
expr, ok := js_parser.ParseJSON(tempLog, source, js_parser.JSONOptions{})
diff --git internal/runtime/runtime.go internal/runtime/runtime.go
index 42ccf39f538..8b4b2c1ae39 100644
--- internal/runtime/runtime.go
+++ internal/runtime/runtime.go
@@ -541,7 +541,7 @@ func Source(unsupportedJSFeatures compat.JSFeature) logger.Source {
return logger.Source{
Index: SourceIndex,
KeyPath: logger.Path{Text: "<runtime>"},
- PrettyPath: "<runtime>",
+ PrettyPaths: logger.PrettyPaths{Abs: "<runtime>", Rel: "<runtime>"},
IdentifierName: "runtime",
Contents: text,
}
diff --git internal/test/util.go internal/test/util.go
index 22089d44b63..d00585ed2b0 100644
--- internal/test/util.go
+++ internal/test/util.go
@@ -29,7 +29,7 @@ func SourceForTest(contents string) logger.Source {
return logger.Source{
Index: 0,
KeyPath: logger.Path{Text: "<stdin>"},
- PrettyPath: "<stdin>",
+ PrettyPaths: logger.PrettyPaths{Abs: "<stdin>", Rel: "<stdin>"},
Contents: contents,
IdentifierName: "stdin",
}
diff --git lib/shared/common.ts lib/shared/common.ts
index 49b9682a5ad..b4784baba67 100644
--- lib/shared/common.ts
+++ lib/shared/common.ts
@@ -174,6 +174,7 @@ function pushCommonFlags(flags: string[], options: CommonOptions, keys: OptionKe
let keepNames = getFlag(options, keys, 'keepNames', mustBeBoolean)
let platform = getFlag(options, keys, 'platform', mustBeString)
let tsconfigRaw = getFlag(options, keys, 'tsconfigRaw', mustBeStringOrObject)
+ let absPaths = getFlag(options, keys, 'absPaths', mustBeArrayOfStrings)
if (legalComments) flags.push(`--legal-comments=${legalComments}`)
if (sourceRoot !== void 0) flags.push(`--source-root=${sourceRoot}`)
@@ -194,6 +195,7 @@ function pushCommonFlags(flags: string[], options: CommonOptions, keys: OptionKe
if (ignoreAnnotations) flags.push(`--ignore-annotations`)
if (drop) for (let what of drop) flags.push(`--drop:${validateStringValue(what, 'drop')}`)
if (dropLabels) flags.push(`--drop-labels=${validateAndJoinStringArray(dropLabels, 'drop label')}`)
+ if (absPaths) flags.push(`--abs-paths=${validateAndJoinStringArray(absPaths, 'abs paths')}`)
if (mangleProps) flags.push(`--mangle-props=${jsRegExpToGoRegExp(mangleProps)}`)
if (reserveProps) flags.push(`--reserve-props=${jsRegExpToGoRegExp(reserveProps)}`)
if (mangleQuoted !== void 0) flags.push(`--mangle-quoted=${mangleQuoted}`)
diff --git lib/shared/types.ts lib/shared/types.ts
index 387a8290e1d..9e69c39f58b 100644
--- lib/shared/types.ts
+++ lib/shared/types.ts
@@ -4,6 +4,7 @@ export type Loader = 'base64' | 'binary' | 'copy' | 'css' | 'dataurl' | 'default
export type LogLevel = 'verbose' | 'debug' | 'info' | 'warning' | 'error' | 'silent'
export type Charset = 'ascii' | 'utf8'
export type Drop = 'console' | 'debugger'
+export type AbsPaths = 'code' | 'log' | 'metafile'
interface CommonOptions {
/** Documentation: https://esbuild.github.io/api/#sourcemap */
@@ -75,6 +76,8 @@ interface CommonOptions {
/** Documentation: https://esbuild.github.io/api/#keep-names */
keepNames?: boolean
+ /** Documentation: https://esbuild.github.io/api/#abs-paths */
+ absPaths?: AbsPaths[]
/** Documentation: https://esbuild.github.io/api/#color */
color?: boolean
/** Documentation: https://esbuild.github.io/api/#log-level */
diff --git npm/@esbuild/aix-ppc64/package.json npm/@esbuild/aix-ppc64/package.json
index 1ff07ce3a97..21589d464c2 100644
--- npm/@esbuild/aix-ppc64/package.json
+++ npm/@esbuild/aix-ppc64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/aix-ppc64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The IBM AIX PowerPC 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/android-arm/package.json npm/@esbuild/android-arm/package.json
index 3b7fafff7d0..95e01f2024d 100644
--- npm/@esbuild/android-arm/package.json
+++ npm/@esbuild/android-arm/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/android-arm",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "A WebAssembly shim for esbuild on Android ARM.",
"repository": {
"type": "git",
diff --git npm/@esbuild/android-arm64/package.json npm/@esbuild/android-arm64/package.json
index c3e982b797e..e0fa37d73d9 100644
--- npm/@esbuild/android-arm64/package.json
+++ npm/@esbuild/android-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/android-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Android ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/android-x64/package.json npm/@esbuild/android-x64/package.json
index 9935c19fbe9..6a00e9ca27c 100644
--- npm/@esbuild/android-x64/package.json
+++ npm/@esbuild/android-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/android-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "A WebAssembly shim for esbuild on Android x64.",
"repository": {
"type": "git",
diff --git npm/@esbuild/darwin-arm64/package.json npm/@esbuild/darwin-arm64/package.json
index 27fa292b5c1..c7712251bf5 100644
--- npm/@esbuild/darwin-arm64/package.json
+++ npm/@esbuild/darwin-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/darwin-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The macOS ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/darwin-x64/package.json npm/@esbuild/darwin-x64/package.json
index 42ccc8275c0..f639e72448e 100644
--- npm/@esbuild/darwin-x64/package.json
+++ npm/@esbuild/darwin-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/darwin-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The macOS 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/freebsd-arm64/package.json npm/@esbuild/freebsd-arm64/package.json
index 8fcd2d6c4a0..31da91aab9c 100644
--- npm/@esbuild/freebsd-arm64/package.json
+++ npm/@esbuild/freebsd-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/freebsd-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The FreeBSD ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/freebsd-x64/package.json npm/@esbuild/freebsd-x64/package.json
index c6132375228..b3dbb2d1892 100644
--- npm/@esbuild/freebsd-x64/package.json
+++ npm/@esbuild/freebsd-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/freebsd-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The FreeBSD 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-arm/package.json npm/@esbuild/linux-arm/package.json
index 82efa41626a..4fbfee91a2e 100644
--- npm/@esbuild/linux-arm/package.json
+++ npm/@esbuild/linux-arm/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-arm",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux ARM binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-arm64/package.json npm/@esbuild/linux-arm64/package.json
index 6c01c56647f..ba1a3ab7e45 100644
--- npm/@esbuild/linux-arm64/package.json
+++ npm/@esbuild/linux-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-ia32/package.json npm/@esbuild/linux-ia32/package.json
index 5a4978f60b2..b60de05ba93 100644
--- npm/@esbuild/linux-ia32/package.json
+++ npm/@esbuild/linux-ia32/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-ia32",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux 32-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-loong64/package.json npm/@esbuild/linux-loong64/package.json
index a4b13875f76..47e77ed9249 100644
--- npm/@esbuild/linux-loong64/package.json
+++ npm/@esbuild/linux-loong64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-loong64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux LoongArch 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-mips64el/package.json npm/@esbuild/linux-mips64el/package.json
index e570e9cf624..d99861bea90 100644
--- npm/@esbuild/linux-mips64el/package.json
+++ npm/@esbuild/linux-mips64el/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-mips64el",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux MIPS 64-bit Little Endian binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-ppc64/package.json npm/@esbuild/linux-ppc64/package.json
index 907990d0591..e4e5d517e9f 100644
--- npm/@esbuild/linux-ppc64/package.json
+++ npm/@esbuild/linux-ppc64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-ppc64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux PowerPC 64-bit Little Endian binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-riscv64/package.json npm/@esbuild/linux-riscv64/package.json
index dd74f0c3aca..51489cd0ce7 100644
--- npm/@esbuild/linux-riscv64/package.json
+++ npm/@esbuild/linux-riscv64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-riscv64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux RISC-V 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-s390x/package.json npm/@esbuild/linux-s390x/package.json
index acb0b50ca23..05d045263ab 100644
--- npm/@esbuild/linux-s390x/package.json
+++ npm/@esbuild/linux-s390x/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-s390x",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux IBM Z 64-bit Big Endian binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/linux-x64/package.json npm/@esbuild/linux-x64/package.json
index 79b3b439606..db4f66f64aa 100644
--- npm/@esbuild/linux-x64/package.json
+++ npm/@esbuild/linux-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/linux-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Linux 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/netbsd-arm64/package.json npm/@esbuild/netbsd-arm64/package.json
index 3bf85903c38..24389a313c5 100644
--- npm/@esbuild/netbsd-arm64/package.json
+++ npm/@esbuild/netbsd-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/netbsd-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The NetBSD ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/netbsd-x64/package.json npm/@esbuild/netbsd-x64/package.json
index ea1b761eaa5..bde4942f47e 100644
--- npm/@esbuild/netbsd-x64/package.json
+++ npm/@esbuild/netbsd-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/netbsd-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The NetBSD AMD64 binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/openbsd-arm64/package.json npm/@esbuild/openbsd-arm64/package.json
index 327737adc24..1c3a48ce9ef 100644
--- npm/@esbuild/openbsd-arm64/package.json
+++ npm/@esbuild/openbsd-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/openbsd-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The OpenBSD ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/openbsd-x64/package.json npm/@esbuild/openbsd-x64/package.json
index 2319ea72ae8..cd479c24565 100644
--- npm/@esbuild/openbsd-x64/package.json
+++ npm/@esbuild/openbsd-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/openbsd-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The OpenBSD 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/openharmony-arm64/package.json npm/@esbuild/openharmony-arm64/package.json
index 83b74f1baea..02582cd7f9e 100644
--- npm/@esbuild/openharmony-arm64/package.json
+++ npm/@esbuild/openharmony-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/openharmony-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "A WebAssembly shim for esbuild on OpenHarmony ARM64.",
"repository": {
"type": "git",
diff --git npm/@esbuild/sunos-x64/package.json npm/@esbuild/sunos-x64/package.json
index f6751268dba..7933da39163 100644
--- npm/@esbuild/sunos-x64/package.json
+++ npm/@esbuild/sunos-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/sunos-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The illumos 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/wasi-preview1/package.json npm/@esbuild/wasi-preview1/package.json
index 4b3e05ff834..71bedb5d8ff 100644
--- npm/@esbuild/wasi-preview1/package.json
+++ npm/@esbuild/wasi-preview1/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/wasi-preview1",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The WASI (WebAssembly System Interface) preview 1 binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/win32-arm64/package.json npm/@esbuild/win32-arm64/package.json
index e975c510a42..50e4a1f152a 100644
--- npm/@esbuild/win32-arm64/package.json
+++ npm/@esbuild/win32-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/win32-arm64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Windows ARM 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/win32-ia32/package.json npm/@esbuild/win32-ia32/package.json
index 0bbdb20708e..8b8880cac5e 100644
--- npm/@esbuild/win32-ia32/package.json
+++ npm/@esbuild/win32-ia32/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/win32-ia32",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Windows 32-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/@esbuild/win32-x64/package.json npm/@esbuild/win32-x64/package.json
index fba3ad7869c..2221275fd5e 100644
--- npm/@esbuild/win32-x64/package.json
+++ npm/@esbuild/win32-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@esbuild/win32-x64",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The Windows 64-bit binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/esbuild-wasm/package.json npm/esbuild-wasm/package.json
index f366a9db962..895031be1b6 100644
--- npm/esbuild-wasm/package.json
+++ npm/esbuild-wasm/package.json
@@ -1,6 +1,6 @@
{
"name": "esbuild-wasm",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "The cross-platform WebAssembly binary for esbuild, a JavaScript bundler.",
"repository": {
"type": "git",
diff --git npm/esbuild/package.json npm/esbuild/package.json
index 58bb7262633..3d234be8f72 100644
--- npm/esbuild/package.json
+++ npm/esbuild/package.json
@@ -1,6 +1,6 @@
{
"name": "esbuild",
- "version": "0.25.6",
+ "version": "0.25.8",
"description": "An extremely fast JavaScript and CSS bundler and minifier.",
"repository": {
"type": "git",
@@ -18,32 +18,32 @@
"esbuild": "bin/esbuild"
},
"optionalDependencies": {
- "@esbuild/aix-ppc64": "0.25.6",
- "@esbuild/android-arm": "0.25.6",
- "@esbuild/android-arm64": "0.25.6",
- "@esbuild/android-x64": "0.25.6",
- "@esbuild/darwin-arm64": "0.25.6",
- "@esbuild/darwin-x64": "0.25.6",
- "@esbuild/freebsd-arm64": "0.25.6",
- "@esbuild/freebsd-x64": "0.25.6",
- "@esbuild/linux-arm": "0.25.6",
- "@esbuild/linux-arm64": "0.25.6",
- "@esbuild/linux-ia32": "0.25.6",
- "@esbuild/linux-loong64": "0.25.6",
- "@esbuild/linux-mips64el": "0.25.6",
- "@esbuild/linux-ppc64": "0.25.6",
- "@esbuild/linux-riscv64": "0.25.6",
- "@esbuild/linux-s390x": "0.25.6",
- "@esbuild/linux-x64": "0.25.6",
- "@esbuild/netbsd-arm64": "0.25.6",
- "@esbuild/netbsd-x64": "0.25.6",
- "@esbuild/openbsd-arm64": "0.25.6",
- "@esbuild/openbsd-x64": "0.25.6",
- "@esbuild/openharmony-arm64": "0.25.6",
- "@esbuild/sunos-x64": "0.25.6",
- "@esbuild/win32-arm64": "0.25.6",
- "@esbuild/win32-ia32": "0.25.6",
- "@esbuild/win32-x64": "0.25.6"
+ "@esbuild/aix-ppc64": "0.25.8",
+ "@esbuild/android-arm": "0.25.8",
+ "@esbuild/android-arm64": "0.25.8",
+ "@esbuild/android-x64": "0.25.8",
+ "@esbuild/darwin-arm64": "0.25.8",
+ "@esbuild/darwin-x64": "0.25.8",
+ "@esbuild/freebsd-arm64": "0.25.8",
+ "@esbuild/freebsd-x64": "0.25.8",
+ "@esbuild/linux-arm": "0.25.8",
+ "@esbuild/linux-arm64": "0.25.8",
+ "@esbuild/linux-ia32": "0.25.8",
+ "@esbuild/linux-loong64": "0.25.8",
+ "@esbuild/linux-mips64el": "0.25.8",
+ "@esbuild/linux-ppc64": "0.25.8",
+ "@esbuild/linux-riscv64": "0.25.8",
+ "@esbuild/linux-s390x": "0.25.8",
+ "@esbuild/linux-x64": "0.25.8",
+ "@esbuild/netbsd-arm64": "0.25.8",
+ "@esbuild/netbsd-x64": "0.25.8",
+ "@esbuild/openbsd-arm64": "0.25.8",
+ "@esbuild/openbsd-x64": "0.25.8",
+ "@esbuild/openharmony-arm64": "0.25.8",
+ "@esbuild/sunos-x64": "0.25.8",
+ "@esbuild/win32-arm64": "0.25.8",
+ "@esbuild/win32-ia32": "0.25.8",
+ "@esbuild/win32-x64": "0.25.8"
},
"license": "MIT"
}
diff --git pkg/api/api.go pkg/api/api.go
index 3399801fba6..7be6945e182 100644
--- pkg/api/api.go
+++ pkg/api/api.go
@@ -265,6 +265,14 @@ const (
MangleQuotedTrue
)
+type AbsPaths uint8
+
+const (
+ CodeAbsPath AbsPaths = 1 << iota
+ LogAbsPath
+ MetafileAbsPath
+)
+
////////////////////////////////////////////////////////////////////////////////
// Build API
@@ -273,6 +281,7 @@ type BuildOptions struct {
LogLevel LogLevel // Documentation: https://esbuild.github.io/api/#log-level
LogLimit int // Documentation: https://esbuild.github.io/api/#log-limit
LogOverride map[string]LogLevel // Documentation: https://esbuild.github.io/api/#log-override
+ AbsPaths AbsPaths // Documentation: https://esbuild.github.io/api/#abs-path
Sourcemap SourceMap // Documentation: https://esbuild.github.io/api/#sourcemap
SourceRoot string // Documentation: https://esbuild.github.io/api/#source-root
@@ -404,6 +413,7 @@ type TransformOptions struct {
LogLevel LogLevel // Documentation: https://esbuild.github.io/api/#log-level
LogLimit int // Documentation: https://esbuild.github.io/api/#log-limit
LogOverride map[string]LogLevel // Documentation: https://esbuild.github.io/api/#log-override
+ AbsPaths AbsPaths // Documentation: https://esbuild.github.io/api/#abs-path
Sourcemap SourceMap // Documentation: https://esbuild.github.io/api/#sourcemap
SourceRoot string // Documentation: https://esbuild.github.io/api/#source-root
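The new public `AbsPaths` option added to `BuildOptions` and `TransformOptions` above is a bit set: each bit independently switches one output surface (code comments, logs, or the metafile) from relative to absolute paths. A minimal self-contained sketch of the flag test, with the constants and a `PathStyle` type redeclared locally to match the shapes in `pkg/api`:

```go
package main

import "fmt"

// AbsPaths is a bit set: one bit per output surface.
type AbsPaths uint8

const (
	CodeAbsPath AbsPaths = 1 << iota
	LogAbsPath
	MetafileAbsPath
)

// PathStyle stands in for logger.PathStyle.
type PathStyle uint8

const (
	RelPath PathStyle = iota
	AbsPath
)

// extractPathStyle mirrors the helper added in api_impl.go: test one flag
// bit and map it to the corresponding path style.
func extractPathStyle(absPaths, flag AbsPaths) PathStyle {
	if absPaths&flag != 0 {
		return AbsPath
	}
	return RelPath
}

func main() {
	opts := LogAbsPath | MetafileAbsPath
	fmt.Println(extractPathStyle(opts, LogAbsPath) == AbsPath)  // true
	fmt.Println(extractPathStyle(opts, CodeAbsPath) == RelPath) // true
}
```

The bitmask design lets a single option value configure all three surfaces at once; on the JS side this surfaces as the `absPaths: ('code' | 'log' | 'metafile')[]` array seen in `lib/shared/types.ts`, which the CLI layer joins into a `--abs-paths=` flag.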
diff --git pkg/api/api_impl.go pkg/api/api_impl.go
index f7232fa66e7..4c790f7661c 100644
--- pkg/api/api_impl.go
+++ pkg/api/api_impl.go
@@ -287,6 +287,14 @@ func validateLoader(value Loader) config.Loader {
}
}
+func extractPathStyle(absPaths AbsPaths, flag AbsPaths) logger.PathStyle {
+ if (absPaths & flag) != 0 {
+ return logger.AbsPath
+ } else {
+ return logger.RelPath
+ }
+}
+
var versionRegex = regexp.MustCompile(`^([0-9]+)(?:\.([0-9]+))?(?:\.([0-9]+))?(-[A-Za-z0-9]+(?:\.[A-Za-z0-9]+)*)?$`)
func validateFeatures(log logger.Log, target Target, engines []Engine) (compat.JSFeature, compat.CSSFeature, map[css_ast.D]compat.CSSPrefix, string) {
@@ -389,9 +397,9 @@ func validateSupported(log logger.Log, supported map[string]bool) (
func validateGlobalName(log logger.Log, text string, path string) []string {
if text != "" {
source := logger.Source{
- KeyPath: logger.Path{Text: path},
- PrettyPath: path,
- Contents: text,
+ KeyPath: logger.Path{Text: path},
+ PrettyPaths: logger.PrettyPaths{Abs: path, Rel: path},
+ Contents: text,
}
if result, ok := js_parser.ParseGlobalName(log, source); ok {
@@ -583,7 +591,7 @@ func validateDefines(
switch logger.API {
case logger.CLIAPI:
data.Location = &logger.MsgLocation{
- File: "<cli>",
+ File: logger.PrettyPaths{Abs: "<cli>", Rel: "<cli>"},
Line: 1,
Column: 30,
Length: len(part),
@@ -593,7 +601,7 @@ func validateDefines(
case logger.JSAPI:
data.Location = &logger.MsgLocation{
- File: "<js>",
+ File: logger.PrettyPaths{Abs: "<js>", Rel: "<js>"},
Line: 1,
Column: 34,
Length: len(part) + 2,
@@ -603,7 +611,7 @@ func validateDefines(
case logger.GoAPI:
data.Location = &logger.MsgLocation{
- File: "<go>",
+ File: logger.PrettyPaths{Abs: "<go>", Rel: "<go>"},
Line: 1,
Column: 50,
Length: len(part) + 2,
@@ -751,10 +759,10 @@ func validateKeepNames(log logger.Log, options *config.Options) {
}
}
-func convertLocationToPublic(loc *logger.MsgLocation) *Location {
+func convertLocationToPublic(loc *logger.MsgLocation, pathStyle logger.PathStyle) *Location {
if loc != nil {
return &Location{
- File: loc.File,
+ File: loc.File.Select(pathStyle),
Namespace: loc.Namespace,
Line: loc.Line,
Column: loc.Column,
@@ -766,7 +774,7 @@ func convertLocationToPublic(loc *logger.MsgLocation) *Location {
return nil
}
-func convertMessagesToPublic(kind logger.MsgKind, msgs []logger.Msg) []Message {
+func convertMessagesToPublic(kind logger.MsgKind, msgs []logger.Msg, pathStyle logger.PathStyle) []Message {
var filtered []Message
for _, msg := range msgs {
if msg.Kind == kind {
@@ -774,14 +782,14 @@ func convertMessagesToPublic(kind logger.MsgKind, msgs []logger.Msg) []Message {
for _, note := range msg.Notes {
notes = append(notes, Note{
Text: note.Text,
- Location: convertLocationToPublic(note.Location),
+ Location: convertLocationToPublic(note.Location, pathStyle),
})
}
filtered = append(filtered, Message{
ID: logger.MsgIDToString(msg.ID),
PluginName: msg.PluginName,
Text: msg.Data.Text,
- Location: convertLocationToPublic(msg.Data.Location),
+ Location: convertLocationToPublic(msg.Data.Location, pathStyle),
Notes: notes,
Detail: msg.Data.UserDetail,
})
@@ -797,7 +805,7 @@ func convertLocationToInternal(loc *Location) *logger.MsgLocation {
namespace = "file"
}
return &logger.MsgLocation{
- File: loc.File,
+ File: logger.PrettyPaths{Abs: loc.File, Rel: loc.File},
Namespace: namespace,
Line: loc.Line,
Column: loc.Column,
@@ -874,6 +882,7 @@ func contextImpl(buildOpts BuildOptions) (*internalContext, []Message) {
MessageLimit: buildOpts.LogLimit,
Color: validateColor(buildOpts.Color),
LogLevel: validateLogLevel(buildOpts.LogLevel),
+ PathStyle: extractPathStyle(buildOpts.AbsPaths, LogAbsPath),
Overrides: validateLogOverrides(buildOpts.LogOverride),
}
@@ -890,7 +899,7 @@ func contextImpl(buildOpts BuildOptions) (*internalContext, []Message) {
if err != nil {
log := logger.NewStderrLog(logOptions)
log.AddError(nil, logger.Range{}, err.Error())
- return nil, convertMessagesToPublic(logger.Error, log.Done())
+ return nil, convertMessagesToPublic(logger.Error, log.Done(), logOptions.PathStyle)
}
// Do not re-evaluate plugins when rebuilding. Also make sure the working
@@ -920,7 +929,7 @@ func contextImpl(buildOpts BuildOptions) (*internalContext, []Message) {
}
stderr.Done()
}
- return nil, convertMessagesToPublic(logger.Error, msgs)
+ return nil, convertMessagesToPublic(logger.Error, msgs, options.LogPathStyle)
}
args := rebuildArgs{
@@ -1078,6 +1087,7 @@ func (ctx *internalContext) Watch(options WatchOptions) error {
fs: ctx.realFS,
shouldLog: logLevel == logger.LevelInfo || logLevel == logger.LevelDebug || logLevel == logger.LevelVerbose,
useColor: ctx.args.logOptions.Color,
+ pathStyle: ctx.args.logOptions.PathStyle,
rebuild: func() fs.WatchData {
return ctx.rebuild().watchData
},
@@ -1294,6 +1304,9 @@ func validateBuildOptions(
MainFields: buildOpts.MainFields,
PublicPath: buildOpts.PublicPath,
KeepNames: buildOpts.KeepNames,
+ CodePathStyle: extractPathStyle(buildOpts.AbsPaths, CodeAbsPath),
+ LogPathStyle: extractPathStyle(buildOpts.AbsPaths, LogAbsPath),
+ MetafilePathStyle: extractPathStyle(buildOpts.AbsPaths, MetafileAbsPath),
InjectPaths: append([]string{}, buildOpts.Inject...),
AbsNodePaths: make([]string, len(buildOpts.NodePaths)),
JSBanner: bannerJS,
@@ -1597,8 +1610,8 @@ func rebuildImpl(args rebuildArgs, oldHashes map[string]string) (rebuildState, m
// Populate the result object with the messages so far
msgs := log.Peek()
- result.Errors = convertMessagesToPublic(logger.Error, msgs)
- result.Warnings = convertMessagesToPublic(logger.Warning, msgs)
+ result.Errors = convertMessagesToPublic(logger.Error, msgs, args.options.LogPathStyle)
+ result.Warnings = convertMessagesToPublic(logger.Warning, msgs, args.options.LogPathStyle)
// Run any registered "OnEnd" callbacks now. These always run regardless of
// whether the current build has been canceled or not. They can check for
@@ -1675,6 +1688,7 @@ func transformImpl(input string, transformOpts TransformOptions) TransformResult
MessageLimit: transformOpts.LogLimit,
Color: validateColor(transformOpts.Color),
LogLevel: validateLogLevel(transformOpts.LogLevel),
+ PathStyle: extractPathStyle(transformOpts.AbsPaths, LogAbsPath),
Overrides: validateLogOverrides(transformOpts.LogOverride),
})
caches := cache.MakeCacheSet()
@@ -1735,6 +1749,9 @@ func transformImpl(input string, transformOpts TransformOptions) TransformResult
TreeShaking: validateTreeShaking(transformOpts.TreeShaking, false /* bundle */, transformOpts.Format),
AbsOutputFile: transformOpts.Sourcefile + "-out",
KeepNames: transformOpts.KeepNames,
+ CodePathStyle: extractPathStyle(transformOpts.AbsPaths, CodeAbsPath),
+ LogPathStyle: extractPathStyle(transformOpts.AbsPaths, LogAbsPath),
+ MetafilePathStyle: extractPathStyle(transformOpts.AbsPaths, MetafileAbsPath),
Stdin: &config.StdinInfo{
Loader: validateLoader(transformOpts.Loader),
Contents: input,
@@ -1823,8 +1840,8 @@ func transformImpl(input string, transformOpts TransformOptions) TransformResult
msgs := log.Done()
return TransformResult{
- Errors: convertMessagesToPublic(logger.Error, msgs),
- Warnings: convertMessagesToPublic(logger.Warning, msgs),
+ Errors: convertMessagesToPublic(logger.Error, msgs, options.LogPathStyle),
+ Warnings: convertMessagesToPublic(logger.Warning, msgs, options.LogPathStyle),
Code: code,
Map: sourceMap,
LegalComments: legalComments,
@@ -2087,8 +2104,8 @@ func loadPlugins(initialOptions *BuildOptions, fs fs.FS, log logger.Log, caches
absResolveDir := validatePath(log, fs, options.ResolveDir, "resolve directory")
if log.HasErrors() {
msgs := log.Done()
- result.Errors = convertMessagesToPublic(logger.Error, msgs)
- result.Warnings = convertMessagesToPublic(logger.Warning, msgs)
+ result.Errors = convertMessagesToPublic(logger.Error, msgs, optionsClone.LogPathStyle)
+ result.Warnings = convertMessagesToPublic(logger.Warning, msgs, optionsClone.LogPathStyle)
return
}
@@ -2108,12 +2125,13 @@ func loadPlugins(initialOptions *BuildOptions, fs fs.FS, log logger.Log, caches
kind,
absResolveDir,
options.PluginData,
+ optionsClone.LogPathStyle,
)
msgs := log.Done()
// Populate the result
- result.Errors = convertMessagesToPublic(logger.Error, msgs)
- result.Warnings = convertMessagesToPublic(logger.Warning, msgs)
+ result.Errors = convertMessagesToPublic(logger.Error, msgs, optionsClone.LogPathStyle)
+ result.Warnings = convertMessagesToPublic(logger.Warning, msgs, optionsClone.LogPathStyle)
if resolveResult != nil {
result.Path = resolveResult.PathPair.Primary.Text
result.External = resolveResult.PathPair.IsExternal
@@ -2127,11 +2145,13 @@ func loadPlugins(initialOptions *BuildOptions, fs fs.FS, log logger.Log, caches
if options.PluginName != "" {
pluginName = options.PluginName
}
- text, _, notes := bundler.ResolveFailureErrorTextSuggestionNotes(resolver, path, kind, pluginName, fs, absResolveDir, optionsForResolve.Platform, "", "")
+ text, _, notes := bundler.ResolveFailureErrorTextSuggestionNotes(
+ resolver, path, kind, pluginName, fs, absResolveDir, optionsForResolve.Platform,
+ logger.PrettyPaths{}, "", optionsClone.LogPathStyle)
result.Errors = append(result.Errors, convertMessagesToPublic(logger.Error, []logger.Msg{{
Data: logger.MsgData{Text: text},
Notes: notes,
- }})...)
+ }}, optionsClone.LogPathStyle)...)
}
return
}
diff --git pkg/api/watcher.go pkg/api/watcher.go
index 6475e1464d9..f7d7079aae4 100644
--- pkg/api/watcher.go
+++ pkg/api/watcher.go
@@ -59,6 +59,7 @@ type watcher struct {
shouldStop int32
shouldLog bool
useColor logger.UseColor
+ pathStyle logger.PathStyle
stopWaitGroup sync.WaitGroup
}
@@ -112,8 +113,9 @@ func (w *watcher) start() {
if w.shouldLog {
logger.PrintTextWithColor(os.Stderr, w.useColor, func(colors logger.Colors) string {
- prettyPath := resolver.PrettyPath(w.fs, logger.Path{Text: absPath, Namespace: "file"})
- return fmt.Sprintf("%s[watch] build started (change: %q)%s\n", colors.Dim, prettyPath, colors.Reset)
+ prettyPaths := resolver.MakePrettyPaths(w.fs, logger.Path{Text: absPath, Namespace: "file"})
+ return fmt.Sprintf("%s[watch] build started (change: %q)%s\n",
+ colors.Dim, prettyPaths.Select(w.pathStyle), colors.Reset)
})
}
diff --git pkg/cli/cli_impl.go pkg/cli/cli_impl.go
index 081609bce2d..e6f1971e6f4 100644
--- pkg/cli/cli_impl.go
+++ pkg/cli/cli_impl.go
@@ -485,6 +485,30 @@ func parseOptionsImpl(
transformOpts.LogOverride[value[:equals]] = logLevel
}
+ case strings.HasPrefix(arg, "--abs-paths="):
+ values := splitWithEmptyCheck(arg[len("--abs-paths="):], ",")
+ var absPaths api.AbsPaths
+ for _, value := range values {
+ switch value {
+ case "code":
+ absPaths |= api.CodeAbsPath
+ case "log":
+ absPaths |= api.LogAbsPath
+ case "metafile":
+ absPaths |= api.MetafileAbsPath
+ default:
+ return parseOptionsExtras{}, cli_helpers.MakeErrorWithNote(
+ fmt.Sprintf("Invalid value %q in %q", value, arg),
+ "Valid values are \"code\", \"log\", or \"metafile\".",
+ )
+ }
+ }
+ if buildOpts != nil {
+ buildOpts.AbsPaths = absPaths
+ } else {
+ transformOpts.AbsPaths = absPaths
+ }
+
case strings.HasPrefix(arg, "--supported:"):
value := arg[len("--supported:"):]
equals := strings.IndexByte(value, '=')
@@ -848,6 +872,7 @@ func parseOptionsImpl(
}
equals := map[string]bool{
+ "abs-paths": true,
"allow-overwrite": true,
"asset-names": true,
"banner": true,
diff --git pkg/cli/mangle_cache.go pkg/cli/mangle_cache.go
index 13c64113fef..05fe75e42ef 100644
--- pkg/cli/mangle_cache.go
+++ pkg/cli/mangle_cache.go
@@ -17,6 +17,7 @@ import (
"github.com/evanw/esbuild/internal/js_lexer"
"github.com/evanw/esbuild/internal/js_parser"
"github.com/evanw/esbuild/internal/logger"
+ "github.com/evanw/esbuild/internal/resolver"
)
func parseMangleCache(osArgs []string, fs fs.FS, absPath string) (map[string]interface{}, []string) {
@@ -44,10 +45,11 @@ func parseMangleCache(osArgs []string, fs fs.FS, absPath string) (map[string]int
}
// Use our JSON parser so we get pretty-printed error messages
+ keyPath := logger.Path{Text: absPath, Namespace: "file"}
source := logger.Source{
- KeyPath: logger.Path{Text: absPath, Namespace: "file"},
- PrettyPath: prettyPath,
- Contents: string(bytes),
+ KeyPath: keyPath,
+ PrettyPaths: resolver.MakePrettyPaths(fs, keyPath),
+ Contents: string(bytes),
}
result, ok := js_parser.ParseJSON(log, source, js_parser.JSONOptions{})
if !ok || log.HasErrors() {
diff --git scripts/end-to-end-tests.js scripts/end-to-end-tests.js
index 956ffaaa5bd..3e22eac3d4a 100644
--- scripts/end-to-end-tests.js
+++ scripts/end-to-end-tests.js
@@ -59,6 +59,21 @@ tests.push(
}),
)
+// Test absolute paths in log messages
+tests.push(
+ test(['entry.js', '--bundle', '--abs-paths=log'], {
+ 'entry.js': `import "./foo"`,
+ }, {
+ expectedStderr: `${errorIcon} [ERROR] Could not resolve "./foo"
+
+ $ABS_PATH_PREFIX$entry.js:1:7:
+ 1 │ import "./foo"
+ ╵ ~~~~~~~
+
+`,
+ }),
+)
+
// Test resolving paths with a question mark (an invalid path on Windows)
tests.push(
test(['entry.js', '--bundle', '--outfile=node.js'], {
@@ -9183,7 +9198,7 @@ function test(args, files, options) {
const hasCJS = args.includes('--format=cjs')
const hasESM = args.includes('--format=esm')
const formats = hasIIFE ? ['iife'] : hasESM ? ['esm'] : hasCJS || !hasBundle ? ['cjs'] : ['cjs', 'esm']
- const expectedStderr = options && options.expectedStderr || '';
+ const baseExpectedStderr = (options && options.expectedStderr || '')
// If the test doesn't specify a format, test both formats
for (const format of formats) {
@@ -9191,6 +9206,8 @@ function test(args, files, options) {
const logLevelArgs = args.some(arg => arg.startsWith('--log-level=')) ? [] : ['--log-level=warning']
const modifiedArgs = (!hasBundle || args.includes(formatArg) ? args : args.concat(formatArg)).concat(logLevelArgs)
const thisTestDir = path.join(testDir, '' + testCount++)
+ const patchString = str => str.replace('$ABS_PATH_PREFIX$', path.join(thisTestDir, 'x').slice(0, -1))
+ const expectedStderr = Array.isArray(baseExpectedStderr) ? baseExpectedStderr.map(patchString) : patchString(baseExpectedStderr)
await fs.mkdir(thisTestDir, { recursive: true })
try {
diff --git scripts/js-api-tests.js scripts/js-api-tests.js
index 93a896ddc93..f8dfc49e299 100644
--- scripts/js-api-tests.js
+++ scripts/js-api-tests.js
@@ -143,6 +143,108 @@ let buildTests = {
assert.strictEqual(require(bOut).y, true)
},
+ async absPathsCodeTest({ esbuild, testDir }) {
+ let srcDir = path.join(testDir, 'src');
+ let outfile = path.join(testDir, 'out', 'result.js');
+ let entry = path.join(srcDir, 'entry.js');
+ fs.mkdirSync(srcDir, { recursive: true });
+ fs.writeFileSync(entry, `x = typeof y == "null"`);
+
+ const { metafile, warnings, outputFiles } = await esbuild.build({
+ entryPoints: [entry],
+ outfile,
+ bundle: true,
+ write: false,
+ metafile: true,
+ format: 'cjs',
+ logLevel: 'silent',
+ absPaths: ['code'],
+ })
+ const cwd = process.cwd()
+ const makePath = absPath => path.relative(cwd, absPath).split(path.sep).join('/')
+
+ assert.strictEqual(outputFiles.length, 1)
+ assert.deepStrictEqual(outputFiles[0].path, outfile)
+ assert.deepStrictEqual(outputFiles[0].text, `// ${entry}\nx = typeof y == "null";\n`)
+
+ assert.deepStrictEqual(Object.keys(metafile.inputs), [makePath(entry)])
+ assert.deepStrictEqual(Object.keys(metafile.outputs), [makePath(outfile)])
+ assert.strictEqual(metafile.inputs[makePath(entry)].imports.length, 0)
+ assert.strictEqual(metafile.outputs[makePath(outfile)].entryPoint, makePath(entry))
+
+ assert.strictEqual(warnings.length, 1)
+ assert.strictEqual(warnings[0].text, 'The "typeof" operator will never evaluate to "null"')
+ assert.strictEqual(warnings[0].location.file, makePath(entry))
+ },
+
+ async absPathsLogTest({ esbuild, testDir }) {
+ let srcDir = path.join(testDir, 'src');
+ let outfile = path.join(testDir, 'out', 'result.js');
+ let entry = path.join(srcDir, 'entry.js');
+ fs.mkdirSync(srcDir, { recursive: true });
+ fs.writeFileSync(entry, `x = typeof y == "null"`);
+
+ const { metafile, warnings, outputFiles } = await esbuild.build({
+ entryPoints: [entry],
+ outfile,
+ bundle: true,
+ write: false,
+ metafile: true,
+ format: 'cjs',
+ logLevel: 'silent',
+ absPaths: ['log'],
+ })
+ const cwd = process.cwd()
+ const makePath = absPath => path.relative(cwd, absPath).split(path.sep).join('/')
+
+ assert.strictEqual(outputFiles.length, 1)
+ assert.deepStrictEqual(outputFiles[0].path, outfile)
+ assert.deepStrictEqual(outputFiles[0].text, `// ${makePath(entry)}\nx = typeof y == "null";\n`)
+
+ assert.deepStrictEqual(Object.keys(metafile.inputs), [makePath(entry)])
+ assert.deepStrictEqual(Object.keys(metafile.outputs), [makePath(outfile)])
+ assert.strictEqual(metafile.inputs[makePath(entry)].imports.length, 0)
+ assert.strictEqual(metafile.outputs[makePath(outfile)].entryPoint, makePath(entry))
+
+ assert.strictEqual(warnings.length, 1)
+ assert.strictEqual(warnings[0].text, 'The "typeof" operator will never evaluate to "null"')
+ assert.strictEqual(warnings[0].location.file, entry)
+ },
+
+ async absPathsMetafileTest({ esbuild, testDir }) {
+ let srcDir = path.join(testDir, 'src');
+ let outfile = path.join(testDir, 'out', 'result.js');
+ let entry = path.join(srcDir, 'entry.js');
+ fs.mkdirSync(srcDir, { recursive: true });
+ fs.writeFileSync(entry, `x = typeof y == "null"`);
+
+ const { metafile, warnings, outputFiles } = await esbuild.build({
+ entryPoints: [entry],
+ outfile,
+ bundle: true,
+ write: false,
+ metafile: true,
+ format: 'cjs',
+ logLevel: 'silent',
+ absPaths: ['metafile'],
+ })
+ const cwd = process.cwd()
+ const makePath = absPath => path.relative(cwd, absPath).split(path.sep).join('/')
+
+ assert.strictEqual(outputFiles.length, 1)
+ assert.deepStrictEqual(outputFiles[0].path, outfile)
+ assert.deepStrictEqual(outputFiles[0].text, `// ${makePath(entry)}\nx = typeof y == "null";\n`)
+
+ assert.deepStrictEqual(Object.keys(metafile.inputs), [entry])
+ assert.deepStrictEqual(Object.keys(metafile.outputs), [outfile])
+ assert.strictEqual(metafile.inputs[entry].imports.length, 0)
+ assert.strictEqual(metafile.outputs[outfile].entryPoint, entry)
+
+ assert.strictEqual(warnings.length, 1)
+ assert.strictEqual(warnings[0].text, 'The "typeof" operator will never evaluate to "null"')
+ assert.strictEqual(warnings[0].location.file, makePath(entry))
+ },
+
async aliasValidity({ esbuild }) {
const valid = async alias => {
const result = await esbuild.build({
diff --git version.txt version.txt
index 3f44db947dc..783dfe1f20a 100644
--- version.txt
+++ version.txt
@@ -1 +1 @@
-0.25.6
+0.25.8
Description

This PR introduces two esbuild releases (0.25.7 and 0.25.8) that add support for JavaScript import phases (`defer` and `source`). The main changes include:
Changes

CHANGELOG.md
cmd/esbuild/main.go & cmd/esbuild/service.go & cmd/esbuild/version.go
internal/ast/ast.go
internal/bundler/bundler.go
internal/js_ast/js_ast.go
internal/js_parser/js_parser.go
internal/js_parser/ts_parser.go
internal/js_printer/js_printer.go
internal/config/config.go & internal/logger/logger.go
internal/resolver/resolver.go
Test files
Package files
Possible Issues
Security Hotspots
This PR contains the following updates:
`0.25.6` -> `0.25.8`
Release Notes
evanw/esbuild (esbuild)
v0.25.8
Compare Source
Fix another TypeScript parsing edge case (#4248)
This fixes a regression with a change in the previous release that tries to more accurately parse TypeScript arrow functions inside the `?:` operator. The regression specifically involves parsing an arrow function containing a `#private` identifier inside the middle of a `?:` ternary operator inside a class body. This was fixed by propagating private identifier state into the parser clone used to speculatively parse the arrow function body. Here is an example of some affected code:

Fix a regression with the parsing of source phase imports
The change in the previous release to parse source phase imports failed to properly handle the following cases: `import source from 'bar'`, `import source from from 'bar'`, and `import source type foo from 'bar'`.
Parsing for these cases should now be fixed. The first case was incorrectly treated as a syntax error because esbuild was expecting the second case. And the last case was previously allowed but is now forbidden. TypeScript hasn't added this feature yet so it remains to be seen whether the last case will be allowed, but it's safer to disallow it for now. At least Babel doesn't allow the last case when parsing TypeScript, and Babel was involved with the source phase import specification.
v0.25.7
Compare Source
Parse and print JavaScript imports with an explicit phase (#4238)
This release adds basic syntax support for the `defer` and `source` import phases in JavaScript:

defer
This is a stage 3 proposal for an upcoming JavaScript feature that will provide one way to eagerly load but lazily initialize imported modules. The imported module is automatically initialized on first use. Support for this syntax will also be part of the upcoming release of TypeScript 5.9. The syntax looks like this:
Note that this feature deliberately cannot be used with the syntax `import defer foo from "<specifier>"` or `import defer { foo } from "<specifier>"`.

source
This is a stage 3 proposal for an upcoming JavaScript feature that will provide another way to eagerly load but lazily initialize imported modules. The imported module is returned in an uninitialized state. Support for this syntax may or may not be a part of TypeScript 5.9 (see this issue for details). The syntax looks like this:
Note that this feature deliberately cannot be used with the syntax `import source * as foo from "<specifier>"` or `import source { foo } from "<specifier>"`.

This change only adds support for this syntax. These imports cannot currently be bundled by esbuild. To use these new features with esbuild's bundler, the imported paths must be external to the bundle and the output format must be set to `esm`.

Support optionally emitting absolute paths instead of relative paths (#338, #2082, #3023)
This release introduces the `--abs-paths=` feature, which takes a comma-separated list of situations where esbuild should use absolute paths instead of relative paths. There are currently three supported situations: `code` (comments and string literals), `log` (log message text and location info), and `metafile` (the JSON build metadata).

Using absolute paths instead of relative paths is not the default behavior because it means that the build results are no longer machine-independent (which means builds are no longer reproducible). Absolute paths can be useful when used with certain terminal emulators that allow you to click on absolute paths in the terminal text and/or when esbuild is being automatically invoked from several different directories within the same script.
Fix a TypeScript parsing edge case (#4241)
This release fixes an edge case with parsing an arrow function in TypeScript with a return type that's in the middle of a `?:` ternary operator. For example:

The `:` token in the value assigned to `x` pairs with the `?` token, so it's not the start of a return type annotation. However, the first `:` token in the value assigned to `y` is the start of a return type annotation because after parsing the arrow function body, it turns out there's another `:` token that can be used to pair with the `?` token. This case is notable as it's the first TypeScript edge case that esbuild has needed a backtracking parser to parse. It has been addressed by a quick hack (cloning the whole parser) as it's a rare edge case and esbuild doesn't otherwise need a backtracking parser. Hopefully this is sufficient and doesn't cause any issues.

Inline small constant strings when minifying
Previously esbuild's minifier didn't inline string constants because strings can be arbitrarily long, and this isn't necessarily a size win if the string is used more than once. Starting with this release, esbuild will now inline string constants when the length of the string is three code units or less. For example:
Note that esbuild's constant inlining only happens in very restrictive scenarios to avoid issues with TDZ handling. This change doesn't change when esbuild's constant inlining happens. It only expands the scope of it to include certain string literals in addition to numeric and boolean literals.
Configuration
📅 Schedule: Branch creation - Tuesday through Thursday ( * * * * 2-4 ) (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.