No new BOMs are processed, Referential integrity constraint violation #4655

Open
JustDoItSascha opened this issue Feb 17, 2025 · 8 comments
Labels: defect (Something isn't working), pending more information

Comments

@JustDoItSascha

Current Behavior

Hello,

Every time we try to upload a new BOM right now, we see this error in the logs:

```
2025-02-17 13:34:51,983 ERROR [BomUploadProcessingTask] Failed to process BOM [bomUploadToken=***, projectName=frontend, projectUuid=***, projectVersion=release-2-4-1]
javax.jdo.JDODataStoreException: Insert of object "org.dependencytrack.model.ComponentProperty@7055d49e" using statement "INSERT INTO COMPONENT_PROPERTY (COMPONENT_ID,DESCRIPTION,GROUPNAME,PROPERTYNAME,PROPERTYTYPE,PROPERTYVALUE,UUID) VALUES (?,?,?,?,?,?,?)" failed : Referential integrity constraint violation: "COMPONENT_COPY_3_0_COMPONENT_PROPERTY_FK1: PUBLIC.COMPONENT_PROPERTY FOREIGN KEY(COMPONENT_ID) REFERENCES PUBLIC.COMPONENT_COPY_3_0(ID) (CAST(154772 AS BIGINT))"; SQL statement:
INSERT INTO COMPONENT_PROPERTY (COMPONENT_ID,DESCRIPTION,GROUPNAME,PROPERTYNAME,PROPERTYTYPE,PROPERTYVALUE,UUID) VALUES (?,?,?,?,?,?,?) [23506-232]
	at org.datanucleus.api.jdo.JDOAdapter.getJDOExceptionForNucleusException(JDOAdapter.java:608)
	at org.datanucleus.api.jdo.JDOPersistenceManager.flush(JDOPersistenceManager.java:2057)
	at org.dependencytrack.tasks.BomUploadProcessingTask.processComponents(BomUploadProcessingTask.java:460)
	at org.dependencytrack.tasks.BomUploadProcessingTask.lambda$processBom$0(BomUploadProcessingTask.java:287)
	at alpine.persistence.AbstractAlpineQueryManager.lambda$runInTransaction$6(AbstractAlpineQueryManager.java:564)
	at alpine.persistence.Transaction.call(Transaction.java:139)
	at alpine.persistence.AbstractAlpineQueryManager.callInTransaction(AbstractAlpineQueryManager.java:542)
	at alpine.persistence.AbstractAlpineQueryManager.runInTransaction(AbstractAlpineQueryManager.java:563)
	at alpine.persistence.AbstractAlpineQueryManager.runInTransaction(AbstractAlpineQueryManager.java:575)
	at org.dependencytrack.tasks.BomUploadProcessingTask.processBom(BomUploadProcessingTask.java:282)
	at org.dependencytrack.tasks.BomUploadProcessingTask.processEvent(BomUploadProcessingTask.java:181)
	at org.dependencytrack.tasks.BomUploadProcessingTask.inform(BomUploadProcessingTask.java:154)
	at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:110)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.h2.jdbc.JdbcSQLIntegrityConstraintViolationException: Referential integrity constraint violation: "COMPONENT_COPY_3_0_COMPONENT_PROPERTY_FK1: PUBLIC.COMPONENT_PROPERTY FOREIGN KEY(COMPONENT_ID) REFERENCES PUBLIC.COMPONENT_COPY_3_0(ID) (CAST(154772 AS BIGINT))"; SQL statement:
INSERT INTO COMPONENT_PROPERTY (COMPONENT_ID,DESCRIPTION,GROUPNAME,PROPERTYNAME,PROPERTYTYPE,PROPERTYVALUE,UUID) VALUES (?,?,?,?,?,?,?) [23506-232]
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:520)
	at org.h2.message.DbException.getJdbcSQLException(DbException.java:489)
	at org.h2.message.DbException.get(DbException.java:223)
	at org.h2.message.DbException.get(DbException.java:199)
	at org.h2.constraint.ConstraintReferential.checkRowOwnTable(ConstraintReferential.java:308)
	at org.h2.constraint.ConstraintReferential.checkRow(ConstraintReferential.java:249)
	at org.h2.table.Table.fireConstraints(Table.java:1227)
	at org.h2.table.Table.fireAfterRow(Table.java:1245)
	at org.h2.command.dml.Insert.insertRows(Insert.java:188)
	at org.h2.command.dml.Insert.update(Insert.java:135)
	at org.h2.command.CommandContainer.executeUpdateWithGeneratedKeys(CommandContainer.java:212)
	at org.h2.command.CommandContainer.update(CommandContainer.java:133)
	at org.h2.command.Command.executeUpdate(Command.java:304)
	at org.h2.command.Command.executeUpdate(Command.java:248)
	at org.h2.jdbc.JdbcPreparedStatement.executeUpdateInternal(JdbcPreparedStatement.java:213)
	at org.h2.jdbc.JdbcPreparedStatement.executeUpdate(JdbcPreparedStatement.java:172)
	at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
	at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
	at org.datanucleus.store.rdbms.SQLController.doExecuteStatementUpdate(SQLController.java:465)
	at org.datanucleus.store.rdbms.SQLController.executeStatementUpdateDeferRowCountCheckForBatching(SQLController.java:415)
	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:532)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:235)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:211)
	at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:4614)
	at org.datanucleus.state.StateManagerImpl.flush(StateManagerImpl.java:5848)
	at org.datanucleus.store.types.SCOUtils.validateObjectForWriting(SCOUtils.java:1342)
	at org.datanucleus.store.rdbms.scostore.ElementContainerStore.validateElementForWriting(ElementContainerStore.java:289)
	at org.datanucleus.store.rdbms.scostore.FKListStore.validateElementForWriting(FKListStore.java:1037)
	at org.datanucleus.store.rdbms.scostore.FKListStore.internalAdd(FKListStore.java:551)
	at org.datanucleus.store.rdbms.scostore.AbstractListStore.addAll(AbstractListStore.java:109)
	at org.datanucleus.store.rdbms.mapping.java.CollectionMapping.postInsert(CollectionMapping.java:163)
	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:640)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:235)
	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:211)
	at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:4614)
	at org.datanucleus.state.StateManagerImpl.flush(StateManagerImpl.java:5848)
	at org.datanucleus.flush.FlushOrdered.execute(FlushOrdered.java:96)
	at org.datanucleus.ExecutionContextImpl.flushInternal(ExecutionContextImpl.java:4051)
	at org.datanucleus.ExecutionContextImpl.flush(ExecutionContextImpl.java:3997)
	at org.datanucleus.api.jdo.JDOPersistenceManager.flush(JDOPersistenceManager.java:2040)
	... 14 common frames omitted
```

Steps to Reproduce

Unfortunately we don't know how to reproduce this. It appeared from one day to the next, and now no project works anymore...

Expected Behavior

Uploading BOM files should be possible again.

Dependency-Track Version

4.12.4

Dependency-Track Distribution

Container Image

Database Server

H2

Database Server Version

No response

Browser

Google Chrome

JustDoItSascha added the defect (Something isn't working) and in triage labels on Feb 17, 2025
@nscuro (Member) commented Feb 17, 2025

> `PUBLIC.COMPONENT_PROPERTY FOREIGN KEY(COMPONENT_ID) REFERENCES PUBLIC.COMPONENT_COPY_3_0(ID)`

Did you modify the foreign key constraint to point it at another table? DT writes component data to the COMPONENT table, but your foreign key references a table called COMPONENT_COPY_3_0. That constraint will of course be violated, since new components are never written to COMPONENT_COPY_3_0.

@JustDoItSascha (Author)

If you're asking whether I modified the SQL tables myself, the answer is no 🙈 I did not touch any SQL table. We did nothing special; we just uploaded BOM files via the API. It all worked, and then suddenly it didn't anymore...

@JustDoItSascha (Author)

@nscuro Any idea what else it could be or how to fix it? I'm just asking because it's unfortunately blocking our release cycle right now :(

@nscuro (Member) commented Feb 18, 2025

All I can tell from some surface-level googling is that the _COPY_ table seems to be created by H2 during migrations(?), and for whatever reason in your case it was not properly cleaned up: https://groups.google.com/g/h2-database/c/EVgdhxEg7XM

These are H2 internals, and there is nothing DT can do about them. I suspect your instance was recently upgraded and was interrupted / killed while running the migrations.

What you could do is use a tool like DBeaver to open the H2 database, find dangling foreign keys as outlined above and similar leftovers, and repair them. If you're lucky, it's only this particular foreign key that's broken.
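
A query along these lines can help spot foreign keys that still reference leftover _COPY_ tables (a sketch assuming the standard INFORMATION_SCHEMA views of H2 2.x; verify the view names against your H2 version):

```sql
-- Sketch: list foreign key constraints whose referenced table looks like
-- an H2 migration leftover. Assumes H2 2.x standard INFORMATION_SCHEMA.
SELECT fk.TABLE_NAME  AS REFERENCING_TABLE,
       fk.CONSTRAINT_NAME,
       pk.TABLE_NAME  AS REFERENCED_TABLE
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS rc
JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS fk
  ON fk.CONSTRAINT_NAME = rc.CONSTRAINT_NAME
JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS pk
  ON pk.CONSTRAINT_NAME = rc.UNIQUE_CONSTRAINT_NAME
WHERE pk.TABLE_NAME LIKE '%COPY%';
```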

If you can afford to start fresh, use PostgreSQL instead of H2, since H2 is not recommended for production usage.

@JustDoItSascha (Author)

Thanks. You mean just deleting the foreign keys should do the trick?

And I wanted to use Postgres, but we are using the Helm chart and I haven't seen any option to use Postgres...

@nscuro (Member) commented Feb 18, 2025

> You mean just deleting the foreign keys should do the trick?

More like recreating it to reference the COMPONENT table instead. Just removing it would potentially open the door for integrity issues later.
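
A sketch of that repair, using the constraint name from the error log above (the new constraint name is just an example; check the actual names in your schema first):

```sql
-- Drop the FK that points at the leftover copy table, then recreate it
-- against the real COMPONENT table. The new constraint name below is
-- illustrative; adapt it to your schema.
ALTER TABLE PUBLIC.COMPONENT_PROPERTY
  DROP CONSTRAINT "COMPONENT_COPY_3_0_COMPONENT_PROPERTY_FK1";

ALTER TABLE PUBLIC.COMPONENT_PROPERTY
  ADD CONSTRAINT "COMPONENT_PROPERTY_COMPONENT_FK"
  FOREIGN KEY (COMPONENT_ID) REFERENCES PUBLIC.COMPONENT (ID);
```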

> but we are using the Helm chart and I haven't seen any option to use Postgres...

The chart doesn't bundle Postgres, because Postgres has a different lifecycle than DT. You can deploy Postgres using another chart if you must run everything on Kubernetes. The chart offers the same options to configure a database as the normal Docker Compose deployment.

@JustDoItSascha (Author)

I will try to convert the H2 database to Postgres. But how do I change the configuration in the Helm chart? There is no explanation in the Helm chart's config...

@nscuro (Member) commented Feb 20, 2025

Configuration of containers mostly happens via environment variables. All services have an extraEnv value that you can use for this purpose: https://github.com/DependencyTrack/helm-charts/blob/ea9368d75766b563032422d1a1b59dda75f690e3/charts/dependency-track/values.yaml#L62-L69

Due to the vast number of settings, we didn't want to add Helm values for every single one of them, only to set an environment variable behind the scenes anyway.
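
For example, pointing the API server at an external PostgreSQL instance could look like this in values.yaml (a sketch; the host, database name, and credentials are placeholders, and the ALPINE_DATABASE_* variables are Dependency-Track's standard database settings):

```yaml
# Sketch: configure an external PostgreSQL database for the API server via
# extraEnv. Host, database name, and credentials below are placeholders.
apiServer:
  extraEnv:
    - name: ALPINE_DATABASE_MODE
      value: "external"
    - name: ALPINE_DATABASE_URL
      value: "jdbc:postgresql://my-postgres:5432/dtrack"
    - name: ALPINE_DATABASE_DRIVER
      value: "org.postgresql.Driver"
    - name: ALPINE_DATABASE_USERNAME
      value: "dtrack"
    - name: ALPINE_DATABASE_PASSWORD
      value: "changeme"
```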
