How do we update a deep cloned entity?

Description

I'm working on a small Java game server. In order to update and save the game in another thread, I'm forced to deep clone some of my entities; otherwise an internal Hibernate "ConcurrentModificationException" occurs when updating my entities.

So my flow currently looks like this:

  • Mark game entities to update
  • Pass those entities into another thread
  • Clone those entities
  • Call "session.update" on the cloned entities
  • Repeat in one minute
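The steps above can be sketched as a self-contained simulation, with the Hibernate calls replaced by an in-memory map; `GameEntity`, `fakeTable` and `updateCycle` are hypothetical names, and the record copy stands in for the deep clone:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

public class UpdateCycleSketch {

    // Stand-in for a persisted entity; id is the primary key
    record GameEntity(long id, int x, int y) {}

    // Stand-in for the database table, keyed by primary key
    static final Map<Long, GameEntity> fakeTable = new ConcurrentHashMap<>();

    // One update cycle: clone the marked entities, then "update" them in another thread
    static CompletableFuture<List<GameEntity>> updateCycle(List<GameEntity> marked) {
        // Records are immutable, so copying the fields is enough here;
        // the real code deep clones mutable Hibernate entities instead
        var clones = marked.stream()
                .map(e -> new GameEntity(e.id(), e.x(), e.y()))
                .toList();

        return CompletableFuture.supplyAsync(() -> {
            for (var entity : clones)
                fakeTable.put(entity.id(), entity); // plays the role of session.update(entity)
            return clones;
        });
    }

    public static void main(String[] args) {
        updateCycle(List.of(new GameEntity(1, 0, 0))).join();
        updateCycle(List.of(new GameEntity(1, 5, 5))).join(); // second cycle: same id is updated
        System.out.println(fakeTable.size()); // 1
    }
}
```

With a plain map keyed by id, the second cycle overwrites instead of duplicating; the problem described below is that Hibernate does not behave this way for the join-table children.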

It works fine with simple classes, but there's a huge problem with relations.

The problem

When I deep clone my "Chunk" entity (see below), its collection (inChunk) also gets deep cloned. I use that deep-cloned entity and pass it to "session.update".

During the update, the chunk ALWAYS inserts its collection children; it never updates them. Because I repeat this update process every minute (see above), this results in a "Duplicate Entry" exception on the second update cycle.
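The failure mode itself can be reproduced without Hibernate: a join table such as chunk_identity effectively has a unique key over its row pair, so re-inserting the same pair fails instead of updating. A minimal simulation (hypothetical names):

```java
import java.util.HashSet;
import java.util.Set;

public class JoinTableSketch {

    // A chunk_identity row: a (chunk, identity) pair, unique per the join table's key
    record JoinRow(long chunkId, long identityId) {}

    static final Set<JoinRow> chunkIdentity = new HashSet<>();

    // Mimics an INSERT that fails on a duplicate key instead of updating
    static void insert(JoinRow row) {
        if (!chunkIdentity.add(row))
            throw new IllegalStateException("Duplicate entry " + row);
    }

    public static void main(String[] args) {
        insert(new JoinRow(1, 42));        // first update cycle: OK
        try {
            insert(new JoinRow(1, 42));    // second cycle re-inserts the same pair
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```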


// Run the database operation for updating the entities asynchronously in a new thread,
// returning the updated entities once done
return CompletableFuture.runAsync(() -> {

    var session = database.openSession();
    session.beginTransaction();

    try {
        // Save entities
        for (var entity : entities)
            session.update(entity);

        session.flush();
        session.clear();

        session.getTransaction().commit();
    } catch (Exception e) {
        var messageComposer = new ExceptionMessageComposer(e);
        GameExtension.getInstance().trace("Update : " + messageComposer.toString());
        session.getTransaction().rollback();
    } finally {
        // Close the session even when the commit or rollback fails
        session.close();
    }
}).thenApply(v -> entities);

@Entity
@Table(name = "chunk", uniqueConstraints = {@UniqueConstraint(columnNames={"x", "y"})}, indexes = {@Index(columnList = "x,y")})
@Access(value = AccessType.FIELD)
@SelectBeforeUpdate(false)
public class Chunk extends HibernateComponent{

    public int x;
    public int y;
    public Date createdOn;

    @OneToMany(fetch = FetchType.EAGER)
    @JoinTable(name = "chunk_identity", joinColumns = @JoinColumn(name = "identity_id"), inverseJoinColumns = @JoinColumn(name = "id"), inverseForeignKey = @ForeignKey(ConstraintMode.NO_CONSTRAINT))
    @Fetch(FetchMode.JOIN)
    @BatchSize(size = 50)
    public Set<Identity> inChunk = new LinkedHashSet<>();

    @Transient
    public Set<ChunkLoader> loadedBy = new LinkedHashSet<>();

    public Chunk() {}
    public Chunk(int x, int y, Date createdOn) {
        this.x = x;
        this.y = y;
        this.createdOn = createdOn;
    }
}


/**
 * Represents an ID of a {@link com.artemis.Entity}, which is unique for each entity and usually the database id
 */
@Entity
@Table(name = "identity")
@Access(AccessType.FIELD)
@SQLInsert(sql = "insert into identity(tag, typeID, id) values(?,?,?) ON DUPLICATE KEY UPDATE id = VALUES(id), tag = values(tag), typeID = values(typeID)")
@SelectBeforeUpdate(value = false)
public class Identity extends Component {

    @Id public long id;
    public String tag;
    public String typeID;

    public Identity() {}
    public Identity(long id, String tag, String typeID) {
        this.id = id;
        this.tag = tag;
        this.typeID = typeID;
    }

}

Question

Why the hell does Hibernate always insert my children without checking whether they have already been inserted? And what can I do to prevent or fix this?

Things I have tried

  • Removing the "@SelectBeforeUpdate" annotations
  • Adding cascading to the "Chunk.inChunk" relation with "ALL" or "MERGE"
  • Running "session.merge" instead of "session.update", which results in the same duplicate entry exception, ignoring the fact that its children are already inserted

None of them worked.

Disclaimer

I need to clone the entities; otherwise it results in the internal Hibernate exception described above.

I'm using a library called "DeepClone" (https://github.com/kostaskougios/cloning) for cloning my entities in another thread.
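For reference, the same effect can be approximated without a library by hand-written copy constructors; a sketch for the two entities above, with the Hibernate annotations omitted for brevity:

```java
import java.util.Date;
import java.util.LinkedHashSet;
import java.util.Set;

// Plain copy of the Identity entity above, annotations omitted
class Identity {
    long id;
    String tag;
    String typeID;

    Identity(long id, String tag, String typeID) {
        this.id = id;
        this.tag = tag;
        this.typeID = typeID;
    }

    // Copy constructor: all fields are value types, so a field copy is a deep copy
    Identity(Identity other) {
        this(other.id, other.tag, other.typeID);
    }
}

// Plain copy of the Chunk entity above, annotations omitted
class Chunk {
    int x;
    int y;
    Date createdOn;
    Set<Identity> inChunk = new LinkedHashSet<>();

    Chunk(int x, int y, Date createdOn) {
        this.x = x;
        this.y = y;
        this.createdOn = createdOn;
    }

    // Copy constructor: clones the collection AND each element in it
    Chunk(Chunk other) {
        this(other.x, other.y, new Date(other.createdOn.getTime()));
        for (var identity : other.inChunk)
            inChunk.add(new Identity(identity));
    }
}
```

Note that either way, the cloned children are detached copies, which is exactly why Hibernate no longer recognizes them as managed instances.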

If more information is required, please write a comment. It's a complex issue and hard to generalize, but I hope I have described it properly.

Asked by genaray on Nov 06 '22

1 Answer

You should consider two major changes:

  1. separation of concerns (read & write)
  2. choose a consistency principle

With #1 you no longer have to deal with ConcurrentModificationException: there is only one component or service modifying your entities.
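For illustration, a ConcurrentModificationException does not even require two threads; structurally modifying a collection while it is being iterated is enough, which is why funneling all writes through a single component removes the whole class of problem:

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.List;

public class CmeSketch {

    // Returns true if mutating the list during iteration triggered a CME
    static boolean mutateWhileIterating() {
        List<Integer> entities = new ArrayList<>(List.of(1, 2, 3));
        try {
            for (var e : entities)
                if (e == 1)
                    entities.remove(e); // structural change while the iterator is live
        } catch (ConcurrentModificationException ex) {
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(mutateWhileIterating()); // true
    }
}
```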

A second component or service only reads the entities in a separate context. All entities must be immutable there to prevent accidental modification.

Both need a contract or interface (in the general meaning, not a Java interface), which brings you to #2. As soon as a change happens, any speed-up trickery in your read-only context must flush caches or re-read/merge the changes. And according to the CAP theorem you have to sacrifice one of the three guarantees (consistency, availability, partition tolerance), depending on your preferred consistency strategy. Since there is no hint about the number of writes, entities and read constraints, I cannot suggest anything more detailed.

If I had to implement it now, there would be at least 3 modules (Java 11/Jigsaw):

  • An API module holding only interfaces, to enforce for example unified get-methods across the following two modules
  • A writer module, with all your Hibernate magic to write entities, plus a sort of listener pattern so the next module can register itself
  • A reader module, with Hibernate magic to read entities and provide them to others (over REST, RPC, …), registering itself with the writer to get some sort of refresh upon changes
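A minimal sketch of such a contract, with the Hibernate parts stubbed out and all names hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// API module: unified read access plus change notification
interface ChunkReader {
    Optional<String> chunkAt(int x, int y);
}

interface ChunkChangeListener {
    void onChunkChanged(int x, int y, String data);
}

// Writer module: the only place entities are modified
class ChunkWriter {
    private final List<ChunkChangeListener> listeners = new ArrayList<>();

    void register(ChunkChangeListener listener) {
        listeners.add(listener);
    }

    void write(int x, int y, String data) {
        // ... the Hibernate persistence would happen here ...
        for (var l : listeners)
            l.onChunkChanged(x, y, data); // notify readers after the change
    }
}

// Reader module: keeps its own read-only view, refreshed by writer events
class ChunkReaderImpl implements ChunkReader, ChunkChangeListener {
    private final ConcurrentHashMap<String, String> view = new ConcurrentHashMap<>();

    public Optional<String> chunkAt(int x, int y) {
        return Optional.ofNullable(view.get(x + "," + y));
    }

    public void onChunkChanged(int x, int y, String data) {
        view.put(x + "," + y, data);
    }
}
```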

The reader module could also read the database at startup and then consume events produced by the writer. These events would act as commands, applying changes in memory in the reader module instead of re-reading from the database. You could then drop Hibernate completely in favor of plain in-memory caching with a bit of event modeling. This would still work in a single JVM by using BlockingQueues and a ConcurrentHashMap (as the cache). Simple JDBC would be enough to bootstrap your model.
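The single-JVM variant could be sketched like this; the event type and all names are hypothetical:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;

public class EventCacheSketch {

    // A command-style event emitted by the writer after it persists a change
    record ChunkUpdated(long chunkId, String payload) {}

    static final BlockingQueue<ChunkUpdated> events = new LinkedBlockingQueue<>();
    static final ConcurrentHashMap<Long, String> cache = new ConcurrentHashMap<>();

    public static void main(String[] args) throws InterruptedException {
        // Reader thread: applies events to the in-memory cache instead of re-reading the DB
        var reader = new Thread(() -> {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    var event = events.take();
                    cache.put(event.chunkId(), event.payload());
                }
            } catch (InterruptedException ignored) {
                // shutdown requested
            }
        });
        reader.start();

        // Writer side: persist (elsewhere), then publish the change as an event
        events.put(new ChunkUpdated(1L, "v1"));
        events.put(new ChunkUpdated(1L, "v2"));

        Thread.sleep(200);   // crude wait so the reader catches up; fine for a sketch
        reader.interrupt();
        System.out.println(cache.get(1L)); // expected: v2
    }
}
```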

If it were that easy to solve with deep copying and a bit of Thread.start, others would do it this way. Since there are plenty of models, strategies and patterns out there regarding concurrent persistence, I suggest refactoring.

Answered by motzmann on Nov 11 '22