Problems with Current State

Publishing modules to the update center currently suffers from a number of problems. Some of these problems do not exist for modules in main (or contrib) in a defined cluster configuration, but that just increases the pressure to bloat these repositories with modules which would not otherwise need to be versioned in concert with the "standard" IDE distribution. (For example, after ExtractingStandaloneCluster it is suddenly much harder to offer the resulting module suite to users.)

Multiple update site URLs

You can manually upload plugins to certain official UCs, or to the Plugin Portal. But this is very cumbersome when you are actively developing a large group of modules. Using CI is clearly desirable here.

The most obvious problem with publishing update sites from Hudson jobs is that the user then needs to add your update center manually. Besides the well-known UCs, the user now has to learn about each such job's update site URL. And if these jobs are moved to a different builder, users have to adjust their Plugin Manager settings. This is not acceptable; as a user you expect your IDE to come preconfigured with a small list of URLs to well-known UCs (e.g. Stable, Contributed, Non-Free) which you can enable or disable according to your tastes.

Ill-defined "versions" of modules released from CI

Radim Kubacki in the context of NB-Android had mentioned being irritated that $job/lastStableBuild/artifact/build/updates/updates.xml does not work quite like you would expect: if you change a module between builds 1 and 2 but do not increment the OpenIDE-Module-Specification-Version, someone who connected to the UC after build 1 had completed would not receive the changes in build 2, but someone who connected for the first time after build 2 would get those changes... so you have two different modules in the field both with the same spec version, which is potentially confusing. You would either (a) want the UC to only offer the copy of the module from build 1 unless and until a new version was explicitly pushed, or you would (b) want the spec version in the AU descriptor to be incremented when a new build is made in which that module was changed.

I have started to play with a Hudson plugin which would implement policy (a): rather than listing build/updates/ in the includes for the artifact publisher, you would tell this special plugin to publish an UC $job/nbuc/updates.xml which would pick up "updates" from builds just as the regular AU client would, so end users would only ever see a single copy of a given module in a given spec version. Policy (b) could also be offered as an option, though it could be tricky to identify "real" changes in a module - the plugin might (b1) compare binary NBMs if it knew how to exclude irrelevant details such as OpenIDE-Module-Build-Number, assuming the build does not insert extraneous timestamps elsewhere; or (b2) it could try to figure out from the build's changelog which NB modules were affected, though that might not be very reliable.
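Policy (b1) could be sketched as a manifest comparison that drops volatile attributes before checking for equality. In this hedged Java sketch the list of attributes to ignore is illustrative only; a real implementation would have to enumerate everything the build stamps differently on each run, and would also need to diff the JAR entries themselves, not just the manifest:

```java
import java.util.jar.Attributes;
import java.util.jar.Manifest;

/** Sketch of policy (b1): compare module manifests ignoring volatile attributes. */
public class ManifestDiff {

    // Illustrative list only; the text mentions OpenIDE-Module-Build-Number,
    // but a real check would need to cover every attribute that varies per build.
    private static final String[] VOLATILE = {
        "OpenIDE-Module-Build-Number",
    };

    static boolean materiallyEqual(Manifest a, Manifest b) {
        return strip(a).equals(strip(b));
    }

    private static Attributes strip(Manifest m) {
        // Copy the main attributes and delete the per-build noise before comparing.
        Attributes copy = new Attributes(m.getMainAttributes());
        for (String name : VOLATILE) {
            copy.remove(new Attributes.Name(name));
        }
        return copy;
    }
}
```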

Although this issue could be addressed using a Hudson plugin, or by publishing modules to a Maven repository set to releases-only mode without using CI, an UC aggregator (see below) would also handle either (a) or (b1) with less burden on the module developers.

The postscript to #196428 suggests another way to implement this functionality in a CI build. This would be cumbersome to use, however: the build would need to first generate its UC from build artifacts normally; then make a local copy of the UC from lastStableBuild (either using $JOB_URL or Copy Artifact Plugin); then run <autoupdate> from the 7.1+ Ant-based harness to create a mirror of the last UC, "update" from the current UC, and publish the resulting mirror as an artifact.

Lack of validation

While modules in cluster.config=experimental are checked for a variety of generic problems, no such check is made for modules published to other update centers. For example, user logs often show the JasperSoft iReports module throwing dozens of exceptions about nonexistent layer file entries. While it might be possible to run some of these tests as part of the various module build harnesses, others really only make sense in the context of other modules. Specific checks that would be useful to run against binaries on an UC include:


testInvisibleModules checks that non-autoload, non-eager modules with no incoming dependencies are visible in Plugin Manager. Otherwise they would be "orphaned" on the UC - you could publish them but PM would never offer them to a user.

testPluginDisplay checks that plugins have a display category, since it is common for this to be forgotten. Of course other checks along these lines could be added.

testAutomaticDependenciesUnused verifies that config/ModuleAutoDeps/*.xml upgrade rules are not being activated. (Whether this is appropriate for an UC depends on how it is configured. If there is a well-defined baseline NB version - say, 6.7 - and all modules on this UC are expected to have been built with 6.7 as a target, then it would be bad for a module to be triggering an upgrade rule. If however NB 7.0 is supported as a target for dependencies, but it is permitted to upload modules built for 6.7+, then it is no error for upgrade rules added in 6.8, 6.9, or 7.0 to be activated - such as the split of org.openide.util.lookup from org.openide.util in 6.9.)

deprecatedModulesAreDisabled checks that entire API modules marked as deprecated are not in use. (Again the appropriateness of this check depends on UC configuration - if a module built for 6.8+ is using org.jdesktop.layout, that is no error even if 7.0 is supported and 6.9 deprecated this module.)


VerifyUpdateCenter

This test has two phases, synchronic and diachronic. Its main purpose is to check that all available modules could be enabled simultaneously according to the module system, i.e. that there are no dependencies on the wrong version of some API. Excepting some special circumstances such as module variants for different platforms, this would not normally fail in synchronic mode since the build harness would prevent bad dependencies earlier.

But diachronic mode is interesting: it starts with a simulated NB installation containing all modules present in the last published snapshot of the UC; then it simulates accepting all updates (i.e. modules with a new OpenIDE-Module-Specification-Version) from the proposed new UC snapshot. If the resulting mixture of old and new modules cannot be fully enabled, the test fails. This catches a variety of common developer errors - for example, publishing an incompatible update to an API module (/1 -> /2) without also pushing new versions of all client modules which accept the new version (/2 if they needed to be fixed, or /1-2 if they can work unmodified after the change).
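The diachronic merge-and-verify step might be sketched like this. The Module shape and plain integer versions are simplifications invented for illustration; the real module system has richer version and dependency semantics:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Simplified sketch of the diachronic enablement check: merge the last
 *  published snapshot with the proposed updates, then verify that every
 *  dependency in the merged set can still be satisfied. */
public class DiachronicCheck {

    record Module(String codebase, int specVersion, Map<String, Integer> deps) {}

    static List<String> check(Map<String, Module> published, Map<String, Module> proposed) {
        // Accept an update only when it carries a strictly newer spec version.
        Map<String, Module> merged = new HashMap<>(published);
        for (Module m : proposed.values()) {
            Module old = merged.get(m.codebase());
            if (old == null || m.specVersion() > old.specVersion()) {
                merged.put(m.codebase(), m);
            }
        }
        // Report any dependency the mixed old/new set cannot satisfy.
        List<String> errors = new ArrayList<>();
        for (Module m : merged.values()) {
            m.deps().forEach((dep, min) -> {
                Module target = merged.get(dep);
                if (target == null || target.specVersion() < min) {
                    errors.add(m.codebase() + " needs " + dep + " >= " + min);
                }
            });
        }
        return errors;
    }
}
```

Here the failure mode from the text shows up as an update to a client module demanding a newer API than the snapshot provides, which is only cured by pushing the matching API update in the same submission.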

(VUC can also be made to check that all autoload modules are in fact used by something. This check is currently unused, since it is sometimes intentional to publish an API on an UC so that it can be used by other modules even though there are no currently published users. The validity of this use case is questionable: in practice such modules are mostly test libraries - needed on an UC purely as an artifact of the Ant-based build system - and deprecated APIs. There are a few other special cases, such as org.netbeans.core.osgi, which is not normally loaded at runtime by the Platform but which needs to be available for people trying to load NetBeans in an OSGi container.)


ValidateClassLinkageTest

This checks that all classes in the module can in fact be loaded and linked by a module class loader. That is not very interesting for the main module JAR in a synchronic setting, since javac would not have emitted unlinkable code. There are two ways in which this might be interesting:

  • Class-Path extensions (modules/ext/*.jar) might contain unintentionally unlinkable classes. Not clear that this is worth checking, since there are so many cases where sloppily packaged third-party libraries include certain classes which depend on other libraries not bundled in the module (log4j, ...) but which the module never in fact attempts to load at runtime anyway.
  • You would want to know that a module built against NB 6.9 no longer links against 7.0 and so should not be permitted on a 7.0 UC. While incompatible changes to official API modules are rare, and should always be accompanied by a new major release version, this policy is not so consistently followed for less prominent APIs, and modules using implementation dependencies will frequently find themselves broken in this way. Additionally, as with VerifyUpdateCenter, it would be useful to check mechanically that when one module in a suite is changed incompatibly, updated versions of its "friends" in the same suite are also pushed - otherwise the user will get an incomplete update that throws numerous errors after a restart.
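A minimal version of such a linkage check might look like the following sketch. A real check would enumerate class names from the module JAR and use the module's own class loader rather than a caller-supplied one:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of a class linkage check: try to load (and link) each named class,
 *  collecting those that fail. */
public class LinkageCheck {

    static List<String> unloadable(ClassLoader loader, List<String> classNames) {
        List<String> failures = new ArrayList<>();
        for (String name : classNames) {
            try {
                // initialize=false: we want loading and linkage errors,
                // without running static initializers.
                Class.forName(name, false, loader);
            } catch (ClassNotFoundException | LinkageError e) {
                failures.add(name + ": " + e);
            }
        }
        return failures;
    }
}
```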


This has a lot of different tests all relating to mistakes made in layer.xml files (occasionally also applicable to generated-layer.xml from annotations). Some are simple checks that the module in isolation is well-formed; for example, a <file url="..."> points to an actual resource. Others only make sense in the context of a large set of modules; for example, it is desirable to ensure that all items in a given menu have requested distinct positions, since otherwise a user installing a number of modules may see these UI elements in an arbitrary order.
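The cross-module position check could be sketched as follows, assuming the layer entries of all installed modules have already been parsed into a map from item name to requested position (the layer parsing itself is omitted):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

/** Sketch of a cross-module layer check: within one folder (e.g. a menu),
 *  flag items that request the same position attribute. */
public class PositionCheck {

    static Map<Integer, List<String>> clashes(Map<String, Integer> itemPositions) {
        // Group items by requested position...
        Map<Integer, List<String>> byPos = new TreeMap<>();
        itemPositions.forEach((item, pos) ->
            byPos.computeIfAbsent(pos, p -> new ArrayList<>()).add(item));
        // ...and keep only positions requested by more than one item.
        byPos.values().removeIf(items -> items.size() < 2);
        byPos.values().forEach(Collections::sort);
        return byPos;
    }
}
```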

Difficulty of signing NBMs

It is recommended that all published NBMs be signed. Modules in main & contrib are automatically signed with the official NB key, which may be inappropriate for a scarcely maintained plugin - the NB organization can affirm that the module was built from sources in a repository it maintains, but not much more.

For modules built elsewhere, signing is not so easy. First you need to generate a key. Then the keystore and storepass cannot be kept in version control, so if you build using CI, you need to pass these in as a special build step. (The Hudson Build Secret plugin makes this possible, but it is still extra work.)

Aggregator Proposal

TODO: compare to:

There should be a single plugin aggregator (probably part of the Plugin Portal) which publishes multiple update sites, giving both base NetBeans version compatibility and some indication of stability. As a very loose example:

  • 6.9 Stable
  • 6.9 Beta
  • 6.9 Alpha
  • 6.9 Community
  • 7.0 Official
  • 7.0 Contributed
  • 7.0 Non-Free

Any logged-in user could define a "plugin" (set of one or more modules), upload modules forming this plugin, and request that it be included in one or more of the listed update sites. Inclusion in a given site would presuppose that the plugin passes all automated checks. Inclusion in certain official sites would also require approval by an administrator, verification by authorized community members, etc.

Upload process

A user should be able to upload a *.nbm, or *.zip of them, manually through the web interface. This is the only option in the current Plugin Portal.

Mechanical upload, e.g. via a web service, may also be useful.

The user should be able to submit the URL of a valid update descriptor such as [1]. In this case the aggregator should poll the given URL periodically (daily?) looking for new NBMs and automatically publish them.
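The polling step would need to read module versions out of the fetched descriptor. Here is a rough Java sketch, assuming the common catalog layout of <module> elements carrying a nested <manifest>; the element and attribute names used here should be checked against the real autoupdate catalog DTD:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

/** Sketch of the polling step: extract codename base and spec version for
 *  each module listed in an autoupdate descriptor. */
public class CatalogPoll {

    static Map<String, String> moduleVersions(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Map<String, String> versions = new LinkedHashMap<>();
        NodeList modules = doc.getElementsByTagName("module");
        for (int i = 0; i < modules.getLength(); i++) {
            Element module = (Element) modules.item(i);
            Element manifest = (Element) module.getElementsByTagName("manifest").item(0);
            versions.put(module.getAttribute("codenamebase"),
                         manifest.getAttribute("OpenIDE-Module-Specification-Version"));
        }
        return versions;
    }
}
```

Comparing the result against the previously published versions tells the aggregator which NBMs (if any) are genuinely new.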

Version-aware push

As mentioned above, it is unacceptable to publish different copies of an NBM at different times with the same OpenIDE-Module-Specification-Version. So the aggregator should simply ignore any "new" NBMs which specify the same version as the currently published NBM (or an older one).
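Comparing dotted spec versions is a numeric, component-wise comparison (so "1.10" is newer than "1.9"). A sketch of the accept/ignore decision:

```java
/** Sketch of dotted spec-version comparison, used to ignore submissions
 *  whose version is not strictly newer than what is already published. */
public class SpecVersion {

    static int compare(String a, String b) {
        String[] as = a.split("\\."), bs = b.split("\\.");
        int n = Math.max(as.length, bs.length);
        for (int i = 0; i < n; i++) {
            // Missing trailing components count as zero ("1.2" == "1.2.0").
            int ai = i < as.length ? Integer.parseInt(as[i]) : 0;
            int bi = i < bs.length ? Integer.parseInt(bs[i]) : 0;
            if (ai != bi) {
                return Integer.compare(ai, bi);
            }
        }
        return 0;
    }

    static boolean acceptUpdate(String published, String submitted) {
        return compare(submitted, published) > 0;
    }
}
```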

For a URL-based poll, this would be the normal case, so there should just be a notification to the user when new NBMs are accepted. For a push upload (especially from the web UI), a warning should be issued if some submitted NBMs are not published.

The aggregator could also include an option (selectable on a per-plugin basis) to use policy (b2) above, i.e. automatically append a final component (.1, .2, ...) to the spec version of a module given in the update descriptor (not the actual NBM or JAR manifest) whenever the NBM seems to have been materially changed since the last published version. A somewhat simpler intermediate policy would be to just issue a warning to the plugin owner in case there appear to be material changes with no matching version increment; in this case the owner can decide whether that is intentional or whether the changes deserve to be published.
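The descriptor-only version bump could be sketched like this, assuming the aggregator records the version it last advertised for each module (the real bookkeeping would be per plugin and per UC):

```java
/** Sketch of the automatic bump: append or increment a trailing component on
 *  the spec version advertised in the update descriptor (the NBM itself is
 *  untouched) when content changed but the module's own version did not. */
public class AutoBump {

    static String bump(String moduleVersion, String lastAdvertised) {
        if (!lastAdvertised.startsWith(moduleVersion + ".")) {
            // First forced bump for this module version: 1.2 -> 1.2.1
            return moduleVersion + ".1";
        }
        // Subsequent bump: 1.2.1 -> 1.2.2, etc.
        String tail = lastAdvertised.substring(moduleVersion.length() + 1);
        return moduleVersion + "." + (Integer.parseInt(tail) + 1);
    }
}
```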

If a new plugin submission includes a component module not previously published, clearly that should be included; as a special case, no history check is necessary for the first submission of a given plugin. TBD what should happen if a new submission omits a module which was previously published. Perhaps the old module should simply be dropped from the UC; users who had previously downloaded that module will still have it, and it may or may not be loadable (or otherwise functional) after an update of the other modules in the plugin. There is no standard way to instruct the Plugin Manager to delete an obsolete module, but perhaps the aggregator can push a dummy "update" to this orphaned module which would be marked autoload (thus not loaded), deprecated, have no contents or dependencies, and serve only to remove all real content from the previous version when installed. Entire plugins which are removed from an UC probably require no special action.

Automated checks

All automated tests mentioned above should be run against submitted modules which are registered as updates. If there are any test failures, the whole submission should be rejected and a message sent to the plugin owner giving details of the failures. If there are no failures, the updated modules should be made live on the activated UCs (subject to any authorization as below).

Diachronic checks

In particular, running module enablement and class linkage checks on submissions would be very helpful for developers wishing to offer a single build of a plugin for multiple NetBeans releases. Typically the Hudson job building the UC would be set up to compile against a particular NB release. But plenty of little modules like contrib/insertunicode are compatible with a very broad range of NB releases, so you could probably get away with building against trunk, or some older or newer release than what some of your users are using, if PP was able to perform a basic sanity check of your modules (such as ValidateModulesTest and ValidateClassLinkageTest mentioned above). It would be great if PP would automatically perform load/link checks of your module suite against all registered NB releases, and not offer it for releases where it seems to fail. This is not a substitute for the manual verification feature that PP already offers, but would catch major problems early.

(If you need to release against multiple versions of NB across an incompatible API change, this is trickier. It is possible when using Maven to build certain modules in a reactor against a different version of NB APIs than others; you can factor out a bridge interface into a common module which OpenIDE-Module-Needs some token, then provide two autoload impl modules each of which -Provides it. Plugin Manager ought to load just the correct impl quietly - though #197914 says this is broken - and any check akin to ValidateModulesTest needs to accept that.)
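The bridge arrangement above might look roughly like the following manifest fragments (the module names are hypothetical, and each module would of course carry its usual dependency attributes as well):

```
OpenIDE-Module: org.example.bridge
OpenIDE-Module-Needs: org.example.bridge.Impl

OpenIDE-Module: org.example.bridge.impl69
OpenIDE-Module-Provides: org.example.bridge.Impl

OpenIDE-Module: org.example.bridge.impl70
OpenIDE-Module-Provides: org.example.bridge.Impl
```

The two impl modules would be marked autoload and each built against its respective API version, so that the module system enables whichever one can actually be satisfied on the running platform.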


Any unsigned NBMs found in a submission should be automatically signed by the aggregator. There could be a single "Contributed Module" key which would serve just to verify that the module was in fact submitted through the official plugin portal, and thus not injected by some man-in-the-middle attack, but this is pretty weak.

Better would be to generate a key for each registered user (perhaps allowing users to upload their own existing key), and to sign any NBMs in a plugin with the key of the plugin owner. This would allow users to place more trust in NBMs coming from well-known developers. TBD how certificate authorities would work in this case - does the IDE need to include a special authority for the plugin portal, permitting it to create its own certificates?

UC-specific authorization

TBD - voting systems, moderators, staging mirrors, etc.


© 2012, Oracle Corporation and/or its affiliates.