Dependency Extraction Webpack Plugin impaired DevX #35630
L-2 policy: The L-2/L-n policy means that the plugin should support only the latest two (or n) versions of its main dependencies. It's to be implemented on the plugin level only. I think it's a nice policy to start with, but we should not stop here, as we still face all the DevX frustration even with it.
Build/automate WP, WC, plugin version maps: I think that all the data is there (somewhere); we just need a nice, automated way to gather it all in a single place, as mentioned in the comment by @jsnajdr.

I'd only add to make it machine- and human-readable, and to cover WooCommerce as well. Conceptually, it's gathering all those version relations in one place. Thanks to that, a developer, support engineer, merchant, or further tools could inspect what versions are used where and decide based on that.

It solves 3, 5, 6; helps with 2, 4. I think it's something we should start with, as it would give us insight and data to reason about the problem.
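A minimal sketch of what such a map plus a lookup could look like. Everything below is invented for illustration: the shape of the map, the helper name, and all the version numbers.

```javascript
// Hypothetical machine- and human-readable version map: for each
// WordPress release, the granular package versions it ships.
// All data below is made up for illustration.
const versionMap = {
  "5.8.1": { "@wordpress/components": "14.2.0", "@wordpress/data": "6.0.1" },
  "5.9.0": { "@wordpress/components": "19.0.2", "@wordpress/data": "6.2.0" },
};

// Look up which version of a package a given WP release provides.
function shippedVersion(map, wpVersion, pkg) {
  const packages = map[wpVersion];
  return packages ? packages[pkg] : undefined;
}

console.log(shippedVersion(versionMap, "5.8.1", "@wordpress/components")); // "14.2.0"
```

A real map would of course be generated from WP/WC release data rather than hand-written.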
Make Dependency Extraction Plugin handle dependency versions: I believe we have all the data in place to solve the problem, assuming we already have a way to get granular dependencies' versions for a given WordPress version.

I'm not very experienced with Webpack, but to me, the feature that DEWP brings is similar to what native import maps do: "Check what dependencies are already available in WP/WC and map those imports to external modules" (instead of adding them to, and looking them up in, the local bundle). What's cool about native import maps (besides the fact that they're native, already available for free in Chromium, and do not require us to complicate our tools and stack) is that they seem to solve the multiple-versions problem, with scopes. So if someday we switch to native ESM, we could use a single map for all plugins, without the need for each plugin to add DEWP to its tool stack.

I'll use import map syntax, as I believe it's clear and declarative enough to express the behavior we'd like to have. Consider that the given WooCommerce version uses:

```
"@woocommerce/components": "5.1.2",
"@woocommerce/currency": "3.1.0",
"@woocommerce/number": "1.2.3",
```

Then the import map for a plugin that would like to use shared dependencies would look like:

```
"imports": {
  "@woocommerce/components": "/wp-content/…/woocommerce-admin/…/components.js",
  "@woocommerce/currency": "/wp-content/…/woocommerce-admin/…/currency.js",
  "@woocommerce/number": "/wp-content/…/woocommerce-admin/…/number.js",
},
```

That's what we have today, and AFAIK, that's where DEWP functionality ends in terms of versions. But hopefully, we can add a bit of checking logic there.

Overlap: If our plugin uses

```
"@woocommerce/components": "^5.1.1",
"@woocommerce/number": ">=1.1.1 <1.2.0",
"fast-json-patch": "^3.0.0",
```

then all the shared dependencies are there, so the bundle will include only the non-shared ones (`fast-json-patch`).

Newer version: If our plugin uses

```
"@woocommerce/components": "^5.1.1",
"@woocommerce/number": "^2.0.0",
"fast-json-patch": "^3.0.0",
```

then `@woocommerce/number` no longer matches what the platform ships, so the map would need a scope for our plugin:

```
"imports": {
  "@woocommerce/components": "/wp-content/…/woocommerce-admin/…/components.js",
  "@woocommerce/currency": "/wp-content/…/woocommerce-admin/…/currency.js",
  "@woocommerce/number": "/wp-content/…/woocommerce-admin/…/number.js",
},
"scopes": {
  "/wp-content/plugins/gla/": {
    "@woocommerce/number": "/wp-content/…/gla/…/number.js",
  },
},
```

(So the imports originating from `/wp-content/plugins/gla/` would resolve `@woocommerce/number` to the plugin's own copy, while everything else still uses the shared files.)

Native import maps are resolved at run-time, so they could use the WP/WC dependency maps from the currently running setup.

It solves 1, 2, 4, 5, 7, 8.
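The scope behavior above can be modeled in a few lines. This is a simplified model of import-map resolution (the real algorithm is defined by the Import Maps spec and does proper URL matching); the `resolve` helper and the concrete paths are hypothetical.

```javascript
// Simplified model of import-map resolution: a scope whose prefix
// matches the referrer's URL wins over the top-level "imports".
const importMap = {
  imports: {
    "@woocommerce/number": "/wp-content/plugins/woocommerce-admin/number.js",
  },
  scopes: {
    "/wp-content/plugins/gla/": {
      "@woocommerce/number": "/wp-content/plugins/gla/number.js",
    },
  },
};

function resolve(map, specifier, referrer) {
  for (const [prefix, scoped] of Object.entries(map.scopes ?? {})) {
    if (referrer.startsWith(prefix) && specifier in scoped) {
      return scoped[specifier]; // plugin-local copy
    }
  }
  return map.imports[specifier]; // shared platform copy
}

console.log(resolve(importMap, "@woocommerce/number", "/wp-content/plugins/gla/app.js"));
// "/wp-content/plugins/gla/number.js"
```

Imports from anywhere else on the page would still get the shared `woocommerce-admin` copy.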
Add runtime import map shim/resolver: This one could be the trickiest to implement, but if a given WordPress version knows its own granular dependency versions and receives the plugin's map, then theoretically it knows whether the versions match. If so, it would do what it does today and return the import from the WP bundle. If not, it could point back to the plugin's bundle. So the "extended" bundle would be requested only if the currently running WP environment does not have a matching dependency. However, I don't know enough about how WP handles all the scripts to propose something more precise, or to judge how doable it is. It may solve all the problems, but would require a lot of changes across the stack.
Just spitballing some ideas: sounds like Webpack Module Federation could help here. The official documentation describes it as a way for separately built applications to share modules at runtime.

Maybe we can replace DEWP with Webpack DLLs. Maybe we can publish npm packages with the DLLs for specific WP versions. Such a package could also provide a set of manifests for the bundler to reference. So MyPlugin depends on the DLL package matching the WP version it targets. Although that would mean I need to maintain several "lines" of MyPlugin: one compatible with each supported WP version.
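A rough sketch of how the DLL idea could look on the consuming plugin's side, using webpack's real `DllReferencePlugin`. The `@wordpress/dll-manifest-5.8` package name and its manifest path are entirely hypothetical; nothing like it is published today.

```javascript
// webpack.config.js — sketch only, not a working setup.
const webpack = require('webpack');

module.exports = {
  // ... MyPlugin's entry/output configuration ...
  plugins: [
    new webpack.DllReferencePlugin({
      context: __dirname,
      // Hypothetical npm package shipping the DLL manifest of everything
      // a given WordPress version already provides:
      manifest: require('@wordpress/dll-manifest-5.8/manifest.json'),
    }),
  ],
};
```

With this approach, the compiled bundle would reference the platform DLL instead of bundling the shared code, at the cost of one config (and release line) per supported WP version.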
I also have one thing that I think causes a lot of pain when it comes to working with DEWP, and that is the inability to tree-shake. If you use a single function from a package like Lodash, you still get the entire externalized bundle at runtime.

Regarding your comments, I actually like the idea of a runtime import map shim/resolver quite a lot. Core currently doesn't ship with multiple versions of the packages, and I'm not sure whether it would be wise to do so. Of course, you could always bundle all your dependencies with your plugin, but as you mentioned, that leads to the same code being imported by multiple plugins and therefore a lot of overhead. So the idea of only loading a bundle if the version isn't a match is very intriguing. I also have no clue, however, how feasible it would actually be to implement something like it.
There are also some related issues about WordPress dependencies and their relation to the published npm packages.

In practice, it is more complex, because sites with the Gutenberg plugin installed will have different versions of the same script dependencies. The Gutenberg plugin not only changes every two weeks; it is also acceptable to remove experimental and unstable APIs after 3 plugin releases (a few weeks). Using experimental/unstable APIs is discouraged, but we don't have control over what plugins use. The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses. In addition, during the WP major release cycle, we only cherry-pick bug fixes from Gutenberg releases and publish them to npm. Whatever this discussion lands on might only work with WordPress core; it is close to impossible to apply the same npm-package-based techniques to sites using the Gutenberg plugin.
@fabiankaegy Tree shaking is available only for packages that are bundled into the compiled JS. That's the opposite of externalization. So, if you really want to bundle the one or two Lodash functions that your script is using, the solution is to opt out of externalizing that package.
@jsnajdr Yeah, I do understand the technical reason for it :) I only wanted to raise it here because, since it happens behind the scenes, it is not as obvious while you are developing, and is therefore a pitfall that you can very easily fall into. So either we can look at better ways of reporting the impact that your externalized imports will have on the end user, or find a technical solution that would allow for more granular imports (I know this would be a very tricky and maybe impossible goal :)). So maybe it is just an addition to the CLI output, showing alongside the file size an estimate of the externalized packages' bundle size, or some sort of reporting :)
This kind of reporting is most useful if the developer can do something about the reported issues, but here I'm afraid they can't do anything. It's a shortcoming of the WordPress platform: my plugin gets the big monolithic package wholesale, whether it needs all of it or not.

The only viable solution is to create small modular packages instead of big monolithic ones. I'm afraid that anything else, like true tree shaking, is at odds with having a plugin architecture and modularity, where multiple blocks and plugins live together on the same page and share stuff.
In my opinion, it even adds more importance to this issue. If the README of DEWP stated that explicitly, then I, as a plugin developer, would do everything to avoid it as much as I can, to be able to assert not only the quality of my product, but also the security and integrity of the data.
Having it solved just for WordPress Core is already a step forward. I agree that comparing npm packages when there are no npm packages doesn't make sense, so "it's impossible to solve it by npm package techniques only". That's why I proposed using import maps, which do not involve npm at all. Plus, I hope that the way Gutenberg processes a release and the way it uses/publishes the packages is not set in stone, but something we can still discuss and potentially improve, if we agree it is suboptimal and impairs the quality of the ecosystem. I believe it's not an impossible thing to solve.

I understand that with the way the WordPress platform works, many plugins may load/overwrite scripts. But that still does not block us from making a solution that would give a plugin some assurance over its dependencies. We know the current DEWP, as it is, does not solve that. Now we need to find out whether we can improve it or need something more. WordPress is not a unique platform when it comes to the problem of delivering a set of shared dependencies while allowing the individual parties (plugins) to add more and overwrite some.
I think that kind of reporting still has some value ;) Having it clearly stated that "Hey, by using DEWP, your bundle will pull the entire externalized package at runtime" would at least make the trade-off visible.

Speaking of tree shaking, I think it's another problem to solve, on a deeper level of complexity and ROI. So far we are struggling with a solution to de-duplicate packages, to "package-shake". Tree-shaking what's delivered with WP against what my plugin needs was out of my scope. I have a gut feeling that first we need assurance that the package we actually import is the package we expected to import, before we start to shake the unwanted bits out of those packages. However, I see that while improving import management, we could provide more insights, like the externalized size mentioned above.

So the developer themselves could decide whether it's worth taking the risk of externalizing a big library, if it could be tree-shaken to something small enough to be bundled locally. Tackling tree-shaking of dependencies already delivered by the platform, according to plugins' usage, would probably require a build phase after the plugin is activated, which I believe is a major architectural shift in the ecosystem.
Could you elaborate on that? Does that mean that if I use one of those packages from npm, the version I develop against may never be the one that actually ships?

AFAIK, yes, at least for WooCommerce (see woocommerce/woocommerce-admin#7628). But I'd appreciate a more elaborate explanation too. I'm curious what the rationale behind it is, and what value it brings. From a plugin developer's perspective, I see a lot of downsides.
The webpack plugin (DEWP) doesn't replace import statements with different versions of the package in the build process controlled by webpack. Instead, it replaces import statements with references to globals that WordPress registers on the page. So

```js
import { ComponentA } from '@wordpress/components';
```

becomes something close to:

```js
const { ComponentA } = wp.components;
```

So as long as the same public API is present in WordPress core through the registered scripts, everything works, regardless of the exact version. Plugins can even feature-detect an API before using it:

```js
return ComponentA ? <ComponentA /> : null;
```

The same strategy is used in the PHP code that plugins write for WordPress.
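As a rough sketch (an illustration of the idea, not DEWP's actual source), the rewrite amounts to a request-to-global mapping like the following; the `externalGlobal` helper is hypothetical.

```javascript
// Sketch of the rewrite DEWP effectively performs: imports of
// @wordpress/* and @woocommerce/* packages become references to globals
// that WordPress registers on the page, e.g. wp.components.
function externalGlobal(request) {
  const match = /^@(wordpress|woocommerce)\/(.+)$/.exec(request);
  if (!match) return undefined; // anything else stays in the bundle
  const namespace = match[1] === "wordpress" ? "wp" : "wc";
  // kebab-case package name → camelCase global property
  const property = match[2].replace(/-([a-z])/g, (_, c) => c.toUpperCase());
  return `${namespace}.${property}`;
}

console.log(externalGlobal("@wordpress/api-fetch")); // "wp.apiFetch"
console.log(externalGlobal("lodash"));               // undefined
```

The real plugin also externalizes packages such as `react` and `lodash` and emits the `*.asset.php` metadata; the point here is only that the rewrite targets runtime globals, not versioned modules.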
This really stood out to me. I'm inferring from your statement that you have some examples of other platforms in mind (on a similar scale as WordPress/Gutenberg) that have wrestled with this problem. If so, what do these other platforms do to solve the problem? Are there things we could learn from those examples?
I sense there is some deep misunderstanding about the purpose of DEWP and what it means to "import" a platform package. When using it in a WordPress plugin, the import doesn't pull a copy of the package into your bundle. The meaning of the import is that the code is running inside some environment that already contains the package, and the import is merely a reference to it. The import means something like a call into the platform. It's similar to doing a native import of a Node.js built-in module: the module comes with the runtime, in whatever version the runtime ships.

The plugin developer doesn't really have that choice, to bundle the platform packages, just like they don't bundle the native Node.js modules. Declaring them in `package.json` then mostly serves local development and tooling.
One additional problem is the documentation of the DEW plugin. The name describes its internal workings rather than the actual value it provides, and the README is not particularly lucid either. So, I think working on these three problems could help us move forward.
@jsnajdr, thank you for a more detailed explanation of my previous comment #35630 (comment). I love the reference to Node.js API. This is exactly how we should think about it.
It's also helpful for linting or hints in IDEs. When using the build tools with DEWP configured to handle all default externals, from the production code perspective, you don't need those packages to be installed in your project.
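A sketch of that workflow: the externalized packages appear only as `devDependencies`, pinned to whatever the oldest supported WordPress release ships, purely for tests, type checks, and IDE tooling. The versions below are invented for illustration.

```json
{
  "devDependencies": {
    "@wordpress/components": "14.2.0",
    "@wordpress/data": "6.0.1"
  }
}
```

Nothing from this list reaches the production bundle; at runtime the imports resolve to whatever the site actually provides.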
I totally understand that the package will be provided by the host environment in production. I think that point is quite clear. My original question is whether I can pull in the same dependency for my dev environment. There are many, many benefits to it, some of them already mentioned (unit testing, type checking, IDE hints, linting...). I assumed that the answer was "yes", that there was an npm package published for every version the platform ships. However, this comment (and this issue) made me doubt it.

I'm not sure I fully understand the comment, but it seems to suggest that the published npm versions don't reliably correspond to what actually runs.
It's possible when working with WordPress core to match the same list of package dependencies that get externalized. However, the applicability is limited to the cases that we covered with @jsnajdr.
The block editor in WordPress core gets updated through npm packages, so the publishing is tied to that. However, for the Gutenberg plugin, all the source is there, so there is no need to publish to npm whenever a new version of the plugin gets released to the WordPress plugin directory. In principle, we can't match npm releases with Gutenberg plugin releases. Sometimes we could, but it is completely unreliable, in particular during the beta/RC release cycle for a WordPress major release.
That would mean that the version of every package is known at the moment a Gutenberg plugin release is built. Is there anything stopping us from also publishing each package to npm at that point?
I also wanted to clarify that the plugin developer doesn't really have that much of a choice about whether they want to bundle a platform package or not.
If this really happens, and it happens often enough, it's just a matter of improving our npm release discipline, isn't it? I don't know how often @gziolo and others publish packages to npm, and how that is synchronized with Gutenberg releases. In principle, we can publish a matching set of npm packages with every release, and have a perfect 1:1 version mapping between them.
Yes, that's exactly how the Gutenberg plugin works. A given WP release has a certain version of Gutenberg built in (you can think of it as an LTS release), and installing the Gutenberg plugin overrides it completely, including all the script packages it registers.

Then there's also the very complex WooCommerce plugin, which uses the Core packages (potentially overridden by Gutenberg) and exposes its own set of packages that are available to plugins that extend Woo.
My experience so far has been mostly with the native JS and HTML/Web Components world, but I believe the principle of the problem is the same regardless of the content of the dependencies. And we don't have to limit ourselves to a ready-made product in the form of a webpack plugin. The case we have is a platform that delivers a set of shared dependencies, while individual parties (plugins) can add more and overwrite some. If we agree that this is the problem we are talking about, then yes, I did work on implementing a similar platform myself. It didn't reach the scale of WordPress when I was there, but I can name a few other examples of (IMHO) big enough scale.

We are writing in JS and HTML, and both languages give us the primitives to import dependencies, to create scopes, to run code within those scopes, and to overwrite higher scopes. They give us the primitives to control the scope, as well as to invert this control. So, the way I see the problem, it's not hard technical limitations, but what we do, how we do it, and what we communicate back to plugin developers like me, or to automated tools.
I think you're right, probably there is. Initially, I thought DEWP was a convenient plugin to optionally use to reduce the bundle size.

I read it as "If WP happens to have your dependency available, it will remove it from the bundle and use that instead". To me, that sounded like an optional optimization.

TBH, this one is still pretty unclear to me. But I read it as "It will create a file that will list the WP script dependencies", whatever a "script dependency" is. I didn't know why I'd need it, but it still didn't look hazardous. The "calculated version" I totally don't get: what is it, and why would it even matter to me? Is it calculated from the source code of my plugin, of its deps, of the "WP script deps"? For sure it's not the version I was looking for, the one that matches my plugin's dependencies in any way.

Cool, so it will reduce the chance for error, not increase it, right? Especially when it comes to me manually maintaining the version I run on my development/test environment. 🤔 From what you stated right here, I was really far from the truth.

Then, to me, it is more confusing than having explicit references. Maybe we should really recommend using the provided globals directly, so nothing in the code masquerades as a regular npm package?
To me, the difference is that when I run an old version of Node, I can check what version of the API is there. The problem stated in the OP is that with WordPress, even though I know the (test/dev/customer) environment is running WP version 5.8.1 and Gutenberg 1.2.3, tracking down which version of a given package is actually there is hard.
That's something really new to me, as I stated above. For a year of developing a plugin that uses these packages, I wasn't aware of that.

And to me, that's the problem I think should be addressed: either on one end, by giving plugin developers better insight and control over what gets externalized, or on the other, by stating it loudly in the docs.
👍👍 I think I already expressed in this comment how confused I get with the current docs :) I'd love to find the value and reasoning in the README. I like to read in the docs why I need it and what it does for me/my project, rather than what it does to the code and how it does that.

Is there a reason/value in not releasing npm packages when releasing Gutenberg? Or was there simply no need to put in the effort to sync those? I hope the comments from @scinos and me here give at least some reason to consider it in the future.
Yeah, compatibility is definitely a big issue with the current approach. There is clearly an expectation (from JS developers) that one can rely on the versions declared in `package.json`. Developers, I think, are approaching the externalized packages as if they were regular npm dependencies.

As a result, I think the way we specify which packages are provided is not ideal. We want developers to clearly understand they need to support a version range (and even more specifically, a range of WordPress or Gutenberg plugin versions, not a range of npm package versions). But the way DEWP interacts with npm/package.json makes that tricky/impossible. Relatedly, it would be helpful to know at bundle time if your package is genuinely incompatible, such as relying on an import only added in the latest version. Obviously, the inherent design of DEWP makes it hard to solve those problems. But it's still worth thinking about these shortcomings, even if they are inherent to the benefits of DEWP.

As a side note, the way package versions relate to the Gutenberg plugin + WordPress has always been confusing to me, especially if one is trying to use a package outside of the WordPress environment. There are several periods where you can't get any updates, because npm releases are frozen during the WP release cycle (though I've never been sure why that is).
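To illustrate why relying on `package.json` ranges breaks down, here is a toy check of whether the version a platform actually ships satisfies a plugin's declared caret range. The helper and all versions are hypothetical; real code would use the `semver` package.

```javascript
// Toy semver check: does the version the platform ships satisfy a
// plugin's caret range? (Handles only simple ^x.y.z ranges, not the
// full semver grammar.)
function satisfiesCaret(range, version) {
  const want = range.replace(/^\^/, "").split(".").map(Number);
  const have = version.split(".").map(Number);
  if (have[0] !== want[0]) return false; // major must match
  if (have[1] > want[1]) return true;    // newer minor is fine
  if (have[1] < want[1]) return false;
  return have[2] >= want[2];             // same minor: patch must be >=
}

// Hypothetical: what the plugin asked for vs. what WP actually ships.
console.log(satisfiesCaret("^5.1.1", "5.1.2")); // true
console.log(satisfiesCaret("^5.1.1", "6.0.0")); // false
```

With DEWP, this check never happens anywhere: the range in `package.json` is simply ignored at runtime.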
(@tomalec sorry for the delayed reply, I've been sick for a few days last week)
The trouble with this statement is that it's not completely false 🙂 When working with a stateless dependency, bundling a private copy is merely wasteful. Then there are other packages, like the ones holding shared state, where every piece of code on the page must talk to the same instance, so a private copy would actually break things.

It's all very similar to Windows DLLs and their static or dynamic linking, and the "DLL hell" phenomenon associated with that 🙂
The generated asset file is consumed by the script registration in PHP:

```php
wp_enqueue_script( 'my-script', 'dist/app.js', array( 'wp-data' ), 'e9f0118ee9' );
```

And the last two arguments come directly from the generated `*.asset.php` file. In other words, it's an internal file that's used by the WordPress script loading system to do all the plumbing correctly. Similarly, Windows DLLs can have companion XML files called "assembly manifests" that tell Windows a lot of details about how to work with that DLL. Again, this confirms that DEWP's README jumps too fast into explaining the internals.
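For reference, a generated `*.asset.php` file is a plain PHP array along these lines (the handle list and hash here mirror the `wp_enqueue_script` example above and are illustrative):

```php
<?php
// dist/app.asset.php — generated by DEWP at build time.
return array(
	'dependencies' => array( 'wp-data' ), // externalized packages, as script handles
	'version'      => 'e9f0118ee9',       // content hash of the build, for cache busting
);
```

The `dependencies` entries are WordPress script handles, not npm package names, which is part of the naming confusion discussed in this thread.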
I still don't understand the remaining problem here. For the externalized reference to work, some plugin must call:

```php
wp_register_script( 'wp-settings', 'dist/build.js' );
```

and that script must define the global:

```js
window.wc.wcSettings = { ... };
```

Is the issue that some part of that is not happening, or that some part of the process is confusing?
This is to mitigate the problems related to DEWP usage mentioned in WordPress#35630. It allows a plugin developer to install to-be-extracted dependencies at the lowest version their plugin targets, to be able to run local tests and linters against that version, and to give the developer some insight into the lowest anticipated version of an individual package.
We landed PR #35106 from @tomalec that adds an optional feature to the Dependency Extraction Webpack Plugin. We now also have issue #36716 for tracking all the work related to adding support for JavaScript Modules and Import Maps in the Gutenberg plugin (and later WordPress Core). It's one of the primary tasks of the newly formed WordPress Performance JavaScript focus group. You can check other initiatives discussed in https://docs.google.com/document/d/1GD0X3bNUa73Afsi8OZjrSDb0VfEgDg-PLlBvw5aL-sc/edit#. It seems like import maps might help to improve control over the dependencies shipped with WordPress Core, but as @jsnajdr pointed out, a good versioning strategy might be hard or impossible in some cases. Let's see how it evolves.

For the issue with the lack of matching npm releases for every possible Gutenberg plugin release, I'm going to propose a revised strategy for npm publishing that will take that into account. It's a more complex task, because we will need to automate npm publishing on CI and link it with the existing Gutenberg plugin releases that are handled by GitHub Actions.
I think it's a pretty accurate description of the path that led (at least me) to those problems. That's why the title of my OP is about impaired DevX. To this day, I haven't found any documentation page that states this clearly.

The DEWP package docs should state and emphasize that. I think the confusion also originates in the fact that DEWP still uses npm-looking package names instead of script handles, and leaves no trace in the plugin code that would suggest these are not regular npm packages.
☝️ That, IMHO, is purely a DEWP DevX problem, not related to the benefits it aims to deliver or the problem it is meant to solve.
I really don't agree that those shortcomings are inherent to the benefits of DEWP. Maybe I'm missing something. Could we please, for the benefit of this discussion, specify a clear, explicit list of the benefits & goals of DEWP? Then we would all know which problems are to be addressed, and which are just obstacles that could be removed or addressed otherwise. Having this list would itself address one DevX problem that I, and AFAIK others (@puntope), have faced.
I think one DevX improvement we could make is to explicitly state which of the externalized dependencies absolutely MUST be externalized so as not to lead to conflicts, and which are externalized simply to save traffic. Currently, as plugin developers, we need to maintain a local copy of a few packages for months. If we knew that we MUST NOT bundle a given package, we could plan and prioritize accordingly.

So maybe we could separate those. As a newly arriving developer, I'd expect the mandatory externals and the merely traffic-saving ones to be listed separately.
Thanks for that explanation :)
In the Google Listings and Ads plugin, we started using one of the externalized `@woocommerce/*` packages.

Naturally, our ESLint instantly warned us to add it to `package.json`. But it turned out to be a source of many problems, as what the npm package provides did not match what was actually delivered at runtime. The above took a toll on many developers' time, caused bugs, and a lot of confusion. I believe such problems could be avoided with better care when it comes to package specifiers, documentation of which packages are actually externalized, and some tooling to help DEWP users (plugin developers) introspect what's happening.
I started a discussion related to that in #37820. Let's tackle this one separately from this issue.
Forgive me for the generic title, but I'd like to tackle many interconnected issues related to Dependency Extraction Webpack Plugin (DEWP) and its DevX, as I believe we need a holistic view to solve those.
TL;DR
Plugin developers have no control over, or even introspection into, the extracted dependencies: not only the versions, but also which packages are being extracted. This heavily decreases the quality of our products and increases the maintenance cost.
I'm trying to gather the problems and ideas here, so we can refer to this issue from GH issues and PRs with potential solutions (like #35106 (comment)) without losing the overview of the problem.
Context
I'm a WooCommerce(WC) plugin developer, so the perspective below could be WC-oriented. However, I believe the problems stated here are not unique to WooCommerce or my plugin and apply to any other WordPress plugin. WooCommerce adds just another layer of dependencies.
Dependency Extraction Webpack Plugin (DEWP)
AFAIK, the main goal of, and reason why we use, DEWP is to avoid delivering duplicated JS packages from plugins to a single WordPress (WP) site. It reduces the network and CPU cost for users, and saves us from a number of problems related to packages with side effects, which may collide when imported/used twice.

I see dependency extraction as a nice way to address that, and I believe that customers' costs and experience should be the top priority when looking for a solution.

But the way our tool works introduces a number of struggles for developers, and IMHO the impact is severe. I have noticed a number of people working on different plugins and products discussing their struggles with DEWP. To somewhat evaluate the impact on the plugin I work on, I marked all issues that are affected by this problem. (However, that doesn't include the everyday frustration.)
Where are we at?
Let’s draw a picture of what dependencies and problems we face while developing a plugin.
Naturally, we have (let’s call them “main”) dependencies: WordPress and WooCommerce. We have a strict requirement – driven by the actual merchant/customer expectation – to support at least a few versions behind, not only the latest, as updating the stack is a cost and burden for our merchants. However, given the finite resources we have, we’d rather balance it so as not to support too many combinations with legacy code. We have a defined process to track updates of those, and a tool (`wc-compat-checker`) to support the PHP side of it. But it's still manual labor, and we do not have much tooling to support the person doing a compatibility check on the JS side.

We have “granular” dependencies: individual npm packages stated in `package.json`. There are also `composer` dependencies, but I’m not sure if they are that problematic; I guess there is no customer-driven requirement for those. I doubt we have customer requests like “Hey, I’d like AutomateWoo to work with my `@woocommerce/[email protected]`-based store”. However, we do have some constraints that come from using many dependencies: they need to be cross-compatible with each other and with the WordPress & WooCommerce versions being used.

Then we have dev dependencies, which the merchant should not care about at all. Unfortunately, they are also tied to the WP/WC versions in use.
Problems
The above creates three fundamental problems, chief among them tracking down the `@woocommerce/components` version being used for the respective WC version.

To solve/mitigate the first problem, there are `@wordpress/dependency-extraction-webpack-plugin` and `@woocommerce/dependency-extraction-webpack-plugin`. However, the way they work makes the whole development quite indeterministic: which packages are extracted? Which main or granular dependency is actually used at runtime? Everything is extracted and used blindly, regardless of versions, without even reporting what was extracted and what is assured to have a specific version. That created a bunch of other problems: I cannot tell which version of `@woocommerce/components` will actually be used, unless I carefully, manually, and deeply curate and extract all the granular dependency trees across all (minor and patch) versions of the main dependencies; and even then, all I get is still a range of versions.

This is very time-consuming labor that needs to be done very often: when adding a dependency, when updating anything, when debugging an issue, when checking compatibility. Currently, there is no tool to support it, nor even a definitive list of resources to track.

That may result in unexpected behavior and errors. Checking for that again requires even more manual effort, as it means digging into the source code of DEWP at a given version and manually comparing the package lists.

That makes reproducing reported problems harder and more time-consuming, eventually affecting customer experience. The environment I develop and test against is driven by my `package.json`, while the one run by the customer is totally different.

In a Slack discussion, @nerrad suggested implementing the L-2 policy: supporting only two versions behind, and building against the lowest versions available. This naturally limits the ranges and the number of combinations, but does not tackle the problems themselves. Plus, it may fail if any main dependency decreases the version of one of its own dependencies, or introduces a backward-incompatible change.
Cost of status quo
In my opinion, the above impairs not only DevX, but also innovation and the quality of our products.
Innovation
Quality
Ways to go
I think when looking for a solution to the problems stated above, we could take a few (non-exclusive) strategies:
Personally, I'd start from the latter, as it's the cheapest to start with (in time, effort, and chaos it'd generate).
//cc all folks who I noticed discussing related ideas: @ecgan, @roo2, @scinos, @jsnajdr, @fullofcaffeine, @noahtallen, @sirreal, @gziolo, @mcsf, @nerrad
Solutions
I don’t have any precise, well-established solution for the above; I’d rather start a discussion about those. But here are a few ideas I managed to gather, or that came to my mind. I put them in separate comments, for easier reference.