
Replacing SchemaExport(Configuration) in Hibernate 5

While migrating from Hibernate 4 to 5 I came across the deprecation and eventual removal of the SchemaExport(Configuration) constructor. What is a good alternative in Hibernate 5?

Use case

During testing we create a SchemaExport instance from a Configuration that has some properties set and defines mapping resources:

// somewhere else `Properties` are filled and passed as a parameter
Properties properties = new Properties();
properties.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
properties.put("hibernate.connection.driver_class", "org.hsqldb.jdbc.JDBCDriver");
// more properties ...

Configuration configuration = new Configuration();
configuration.setProperties(properties);
// parameter `String... mappingResources`
for (final String mapping : mappingResources) {
    configuration.addResource(mapping);
}

// this doesn't compile
SchemaExport schemaExport = new SchemaExport(configuration);

The last line does not compile on Hibernate 5 because the constructor was removed.

Options

The deprecation notice recommends using the SchemaExport(MetadataImplementor) constructor, but I struggle to find a good way to create a MetadataImplementor instance. I found a few options, but they all look fishy to me.

The only concrete implementations in Hibernate that I could find are MetadataImpl and InFlightMetadataCollectorImpl, but both live in org.hibernate.boot.internal, so I assume I'm not supposed to use them. Also, MetadataImpl has a massive constructor in which I need to provide every little detail, and InFlightMetadataCollectorImpl needs a MetadataBuildingOptions, which has the same problem as MetadataImplementor: its implementation is internal and hard to construct.

Alternatively, it looks like MetadataBuilderImpl might be a handy way to construct a MetadataImplementor, but it's also internal.

Either way, I couldn't find out how to set Properties (or their entries) on a MetadataImplementor (or MetadataBuilderImpl for that matter).

Question

Is a MetadataImplementor really required to create a SchemaExport? If so, how do I get one from a supported API and how do I set Properties?

Eventually we want to run the export with execute, but that method's signature changed as well. I see it now takes a ServiceRegistry, so maybe that would be a way out?

EDIT

Argh, I just saw that in 5.2 (which I want to use) SchemaExport does not even take a MetadataImplementor any longer - only the parameterless constructor remains. What now?

asked Nov 22 '17 by Nicolai Parlog

3 Answers

In Hibernate, we have this base test class:

public class BaseNonConfigCoreFunctionalTestCase extends BaseUnitTestCase {
    public static final String VALIDATE_DATA_CLEANUP = "hibernate.test.validateDataCleanup";

    private StandardServiceRegistry serviceRegistry;
    private MetadataImplementor metadata;
    private SessionFactoryImplementor sessionFactory;

    private Session session;

    protected Dialect getDialect() {
        if ( serviceRegistry != null ) {
            return serviceRegistry.getService( JdbcEnvironment.class ).getDialect();
        }
        else {
            return BaseCoreFunctionalTestCase.getDialect();
        }
    }

    protected StandardServiceRegistry serviceRegistry() {
        return serviceRegistry;
    }

    protected MetadataImplementor metadata() {
        return metadata;
    }

    protected SessionFactoryImplementor sessionFactory() {
        return sessionFactory;
    }

    protected Session openSession() throws HibernateException {
        session = sessionFactory().openSession();
        return session;
    }

    protected Session openSession(Interceptor interceptor) throws HibernateException {
        session = sessionFactory().withOptions().interceptor( interceptor ).openSession();
        return session;
    }

    protected Session getSession() {
        return session;
    }

    protected void rebuildSessionFactory() {
        releaseResources();
        buildResources();
    }

    protected void cleanupCache() {
        if ( sessionFactory != null ) {
            sessionFactory.getCache().evictAllRegions();
        }
    }

    // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    // JUNIT hooks

    @BeforeClassOnce
    @SuppressWarnings( {"UnusedDeclaration"})
    protected void startUp() {
        buildResources();
    }

    protected void buildResources() {
        final StandardServiceRegistryBuilder ssrb = constructStandardServiceRegistryBuilder();

        serviceRegistry = ssrb.build();
        afterStandardServiceRegistryBuilt( serviceRegistry );

        final MetadataSources metadataSources = new MetadataSources( serviceRegistry );
        applyMetadataSources( metadataSources );
        afterMetadataSourcesApplied( metadataSources );

        final MetadataBuilder metadataBuilder = metadataSources.getMetadataBuilder();
        initialize( metadataBuilder );
        configureMetadataBuilder( metadataBuilder );

        metadata = (MetadataImplementor) metadataBuilder.build();
        applyCacheSettings( metadata );
        afterMetadataBuilt( metadata );

        final SessionFactoryBuilder sfb = metadata.getSessionFactoryBuilder();
        initialize( sfb, metadata );
        configureSessionFactoryBuilder( sfb );

        sessionFactory = (SessionFactoryImplementor) sfb.build();
        afterSessionFactoryBuilt( sessionFactory );
    }

    protected final StandardServiceRegistryBuilder constructStandardServiceRegistryBuilder() {
        final BootstrapServiceRegistryBuilder bsrb = new BootstrapServiceRegistryBuilder();
        bsrb.applyClassLoader( getClass().getClassLoader() );
        // by default we do not share the BootstrapServiceRegistry nor the StandardServiceRegistry,
        // so we want the BootstrapServiceRegistry to be automatically closed when the
        // StandardServiceRegistry is closed.
        bsrb.enableAutoClose();
        configureBootstrapServiceRegistryBuilder( bsrb );

        final BootstrapServiceRegistry bsr = bsrb.build();
        afterBootstrapServiceRegistryBuilt( bsr );

        final Map settings = new HashMap();
        addSettings( settings );

        final StandardServiceRegistryBuilder ssrb = new StandardServiceRegistryBuilder( bsr );
        initialize( ssrb );
        ssrb.applySettings( settings );
        configureStandardServiceRegistryBuilder( ssrb );
        return ssrb;
    }

    protected void addSettings(Map settings) {
    }

    /**
     * Apply any desired config to the BootstrapServiceRegistryBuilder to be incorporated
     * into the built BootstrapServiceRegistry
     *
     * @param bsrb The BootstrapServiceRegistryBuilder
     */
    @SuppressWarnings({"SpellCheckingInspection", "UnusedParameters"})
    protected void configureBootstrapServiceRegistryBuilder(BootstrapServiceRegistryBuilder bsrb) {
    }

    /**
     * Hook to allow tests to use the BootstrapServiceRegistry if they wish
     *
     * @param bsr The BootstrapServiceRegistry
     */
    @SuppressWarnings("UnusedParameters")
    protected void afterBootstrapServiceRegistryBuilt(BootstrapServiceRegistry bsr) {
    }

    @SuppressWarnings("SpellCheckingInspection")
    private void initialize(StandardServiceRegistryBuilder ssrb) {
        final Dialect dialect = BaseCoreFunctionalTestCase.getDialect();

        ssrb.applySetting( AvailableSettings.CACHE_REGION_FACTORY, CachingRegionFactory.class.getName() );
        ssrb.applySetting( AvailableSettings.USE_NEW_ID_GENERATOR_MAPPINGS, "true" );
        if ( createSchema() ) {
            ssrb.applySetting( AvailableSettings.HBM2DDL_AUTO, "create-drop" );
            final String secondSchemaName = createSecondSchema();
            if ( StringHelper.isNotEmpty( secondSchemaName ) ) {
                if ( !H2Dialect.class.isInstance( dialect ) ) {
                    // while it may be true that only H2 supports creation of a second schema via
                    // URL (no idea whether that is accurate), every db should support creation of schemas
                    // via DDL which SchemaExport can create for us.  See how this is used and
                    // whether that usage could not just leverage that capability
                    throw new UnsupportedOperationException( "Only H2 dialect supports creation of second schema." );
                }
                Helper.createH2Schema( secondSchemaName, ssrb.getSettings() );
            }
        }
        ssrb.applySetting( AvailableSettings.DIALECT, dialect.getClass().getName() );
    }

    protected boolean createSchema() {
        return true;
    }

    protected String createSecondSchema() {
        // poorly named, yes, but to keep migration easy for existing BaseCoreFunctionalTestCase
        // impls I kept the same name from there
        return null;
    }

    /**
     * Apply any desired config to the StandardServiceRegistryBuilder to be incorporated
     * into the built StandardServiceRegistry
     *
     * @param ssrb The StandardServiceRegistryBuilder
     */
    @SuppressWarnings({"SpellCheckingInspection", "UnusedParameters"})
    protected void configureStandardServiceRegistryBuilder(StandardServiceRegistryBuilder ssrb) {
    }

    /**
     * Hook to allow tests to use the StandardServiceRegistry if they wish
     *
     * @param ssr The StandardServiceRegistry
     */
    @SuppressWarnings("UnusedParameters")
    protected void afterStandardServiceRegistryBuilt(StandardServiceRegistry ssr) {
    }

    protected void applyMetadataSources(MetadataSources metadataSources) {
        for ( String mapping : getMappings() ) {
            metadataSources.addResource( getBaseForMappings() + mapping );
        }

        for ( Class annotatedClass : getAnnotatedClasses() ) {
            metadataSources.addAnnotatedClass( annotatedClass );
        }

        for ( String annotatedPackage : getAnnotatedPackages() ) {
            metadataSources.addPackage( annotatedPackage );
        }

        for ( String ormXmlFile : getXmlFiles() ) {
            metadataSources.addInputStream( Thread.currentThread().getContextClassLoader().getResourceAsStream( ormXmlFile ) );
        }
    }

    protected static final String[] NO_MAPPINGS = new String[0];

    protected String[] getMappings() {
        return NO_MAPPINGS;
    }

    protected String getBaseForMappings() {
        return "org/hibernate/test/";
    }

    protected static final Class[] NO_CLASSES = new Class[0];

    protected Class[] getAnnotatedClasses() {
        return NO_CLASSES;
    }

    protected String[] getAnnotatedPackages() {
        return NO_MAPPINGS;
    }

    protected String[] getXmlFiles() {
        return NO_MAPPINGS;
    }

    protected void afterMetadataSourcesApplied(MetadataSources metadataSources) {
    }

    protected void initialize(MetadataBuilder metadataBuilder) {
        metadataBuilder.enableNewIdentifierGeneratorSupport( true );
        metadataBuilder.applyImplicitNamingStrategy( ImplicitNamingStrategyLegacyJpaImpl.INSTANCE );
    }

    protected void configureMetadataBuilder(MetadataBuilder metadataBuilder) {
    }

    protected boolean overrideCacheStrategy() {
        return true;
    }

    protected String getCacheConcurrencyStrategy() {
        return null;
    }

    protected final void applyCacheSettings(Metadata metadata) {
        if ( !overrideCacheStrategy() ) {
            return;
        }

        if ( getCacheConcurrencyStrategy() == null ) {
            return;
        }

        for ( PersistentClass entityBinding : metadata.getEntityBindings() ) {
            if ( entityBinding.isInherited() ) {
                continue;
            }

            boolean hasLob = false;

            final Iterator props = entityBinding.getPropertyClosureIterator();
            while ( props.hasNext() ) {
                final Property prop = (Property) props.next();
                if ( prop.getValue().isSimpleValue() ) {
                    if ( isLob( ( (SimpleValue) prop.getValue() ).getTypeName() ) ) {
                        hasLob = true;
                        break;
                    }
                }
            }

            if ( !hasLob ) {
                ( ( RootClass) entityBinding ).setCacheConcurrencyStrategy( getCacheConcurrencyStrategy() );
            }
        }

        for ( Collection collectionBinding : metadata.getCollectionBindings() ) {
            boolean isLob = false;

            if ( collectionBinding.getElement().isSimpleValue() ) {
                isLob = isLob( ( (SimpleValue) collectionBinding.getElement() ).getTypeName() );
            }

            if ( !isLob ) {
                collectionBinding.setCacheConcurrencyStrategy( getCacheConcurrencyStrategy() );
            }
        }
    }

    private boolean isLob(String typeName) {
        return "blob".equals( typeName )
                || "clob".equals( typeName )
                || "nclob".equals( typeName )
                || Blob.class.getName().equals( typeName )
                || Clob.class.getName().equals( typeName )
                || NClob.class.getName().equals( typeName )
                || BlobType.class.getName().equals( typeName )
                || ClobType.class.getName().equals( typeName )
                || NClobType.class.getName().equals( typeName );
    }

    protected void afterMetadataBuilt(Metadata metadata) {
    }

    private void initialize(SessionFactoryBuilder sfb, Metadata metadata) {
        // todo : this is where we need to apply cache settings to be like BaseCoreFunctionalTestCase
        //      it reads the class/collection mappings and creates corresponding
        //      CacheRegionDescription references.
        //
        //      Ultimately I want those to go on MetadataBuilder, and in fact MetadataBuilder
        //      already defines the needed method.  But for the pattern used by the
        //      tests we need this as part of SessionFactoryBuilder
    }

    protected void configureSessionFactoryBuilder(SessionFactoryBuilder sfb) {
    }

    protected void afterSessionFactoryBuilt(SessionFactoryImplementor sessionFactory) {
    }

    @AfterClassOnce
    @SuppressWarnings( {"UnusedDeclaration"})
    protected void shutDown() {
        releaseResources();
    }

    protected void releaseResources() {
        if ( sessionFactory != null ) {
            try {
                sessionFactory.close();
            }
            catch (Exception e) {
                System.err.println( "Unable to release SessionFactory : " + e.getMessage() );
                e.printStackTrace();
            }
        }
        sessionFactory = null;

        if ( serviceRegistry != null ) {
            try {
                StandardServiceRegistryBuilder.destroy( serviceRegistry );
            }
            catch (Exception e) {
                System.err.println( "Unable to release StandardServiceRegistry : " + e.getMessage() );
                e.printStackTrace();
            }
        }
        serviceRegistry = null;
    }

    @OnFailure
    @OnExpectedFailure
    @SuppressWarnings( {"UnusedDeclaration"})
    public void onFailure() {
        if ( rebuildSessionFactoryOnError() ) {
            rebuildSessionFactory();
        }
    }

    protected boolean rebuildSessionFactoryOnError() {
        return true;
    }

    @Before
    public final void beforeTest() throws Exception {
        prepareTest();
    }

    protected void prepareTest() throws Exception {
    }

    @After
    public final void afterTest() throws Exception {
        completeStrayTransaction();

        if ( isCleanupTestDataRequired() ) {
            cleanupTestData();
        }
        cleanupTest();

        cleanupSession();

        assertAllDataRemoved();
    }

    private void completeStrayTransaction() {
        if ( session == null ) {
            // nothing to do
            return;
        }

        if ( ( (SessionImplementor) session ).isClosed() ) {
            // nothing to do
            return;
        }

        if ( !session.isConnected() ) {
            // nothing to do
            return;
        }

        final TransactionCoordinator.TransactionDriver tdc =
                ( (SessionImplementor) session ).getTransactionCoordinator().getTransactionDriverControl();

        if ( tdc.getStatus().canRollback() ) {
            session.getTransaction().rollback();
        }
    }

    protected boolean isCleanupTestDataRequired() {
        return false;
    }

    protected void cleanupTestData() throws Exception {
        doInHibernate(this::sessionFactory, s -> {
            s.createQuery("delete from java.lang.Object").executeUpdate();
        });
    }


    private void cleanupSession() {
        if ( session != null && ! ( (SessionImplementor) session ).isClosed() ) {
            session.close();
        }
        session = null;
    }

    public class RollbackWork implements Work {
        public void execute(Connection connection) throws SQLException {
            connection.rollback();
        }
    }

    protected void cleanupTest() throws Exception {
    }

    @SuppressWarnings( {"UnnecessaryBoxing", "UnnecessaryUnboxing"})
    protected void assertAllDataRemoved() {
        if ( !createSchema() ) {
            return; // no tables were created...
        }
        if ( !Boolean.getBoolean( VALIDATE_DATA_CLEANUP ) ) {
            return;
        }

        Session tmpSession = sessionFactory.openSession();
        try {
            List list = tmpSession.createQuery( "select o from java.lang.Object o" ).list();

            Map<String,Integer> items = new HashMap<String,Integer>();
            if ( !list.isEmpty() ) {
                for ( Object element : list ) {
                    Integer l = items.get( tmpSession.getEntityName( element ) );
                    if ( l == null ) {
                        l = 0;
                    }
                    l = l + 1 ;
                    items.put( tmpSession.getEntityName( element ), l );
                    System.out.println( "Data left: " + element );
                }
                fail( "Data is left in the database: " + items.toString() );
            }
        }
        finally {
            try {
                tmpSession.close();
            }
            catch( Throwable t ) {
                // intentionally empty
            }
        }
    }



    public void inSession(Consumer<SessionImplementor> action) {
        log.trace( "#inSession(action)" );
        inSession( sessionFactory(), action );
    }

    public void inTransaction(Consumer<SessionImplementor> action) {
        log.trace( "#inTransaction(action)" );
        inTransaction( sessionFactory(), action );
    }

    public void inSession(SessionFactoryImplementor sfi, Consumer<SessionImplementor> action) {
        log.trace( "##inSession(SF,action)" );

        try (SessionImplementor session = (SessionImplementor) sfi.openSession()) {
            log.trace( "Session opened, calling action" );
            action.accept( session );
            log.trace( "called action" );
        }
        finally {
            log.trace( "Session close - auto-close lock" );
        }
    }

    public void inTransaction(SessionFactoryImplementor factory, Consumer<SessionImplementor> action) {
        log.trace( "#inTransaction(factory, action)");


        try (SessionImplementor session = (SessionImplementor) factory.openSession()) {
            log.trace( "Session opened, calling action" );
            inTransaction( session, action );
            log.trace( "called action" );
        }
        finally {
            log.trace( "Session close - auto-close lock" );
        }
    }

    public void inTransaction(SessionImplementor session, Consumer<SessionImplementor> action) {
        log.trace( "inTransaction(session,action)" );

        final Transaction txn = session.beginTransaction();
        log.trace( "Started transaction" );

        try {
            log.trace( "Calling action in txn" );
            action.accept( session );
            log.trace( "Called action - in txn" );

            log.trace( "Committing transaction" );
            txn.commit();
            log.trace( "Committed transaction" );
        }
        catch (Exception e) {
            log.tracef(
                    "Error calling action: %s (%s) - rolling back",
                    e.getClass().getName(),
                    e.getMessage()
            );
            try {
                txn.rollback();
            }
            catch (Exception ignore) {
                log.trace( "Was unable to roll back transaction" );
                // really nothing else we can do here - the attempt to
                //      rollback already failed and there is nothing else
                //      to clean up.
            }

            throw e;
        }
    }
}

which bootstraps the Metadata and the ServiceRegistry.

So, we can call the SchemaExport like this:

new SchemaExport().create( EnumSet.of( TargetType.DATABASE ), metadata() );
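
For the use case from the question (a Properties object plus hbm.xml mapping resources) that bootstrap can presumably be condensed to something like the sketch below; properties and mappingResources are the variables from the question, TargetType lives in org.hibernate.tool.schema, and the other classes live under org.hibernate.boot:

// the Properties go into the StandardServiceRegistry instead of a Configuration
StandardServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder()
        .applySettings( properties )
        .build();

// the mapping resources go into MetadataSources instead of a Configuration
MetadataSources sources = new MetadataSources( serviceRegistry );
for ( String mapping : mappingResources ) {
    sources.addResource( mapping );
}

MetadataImplementor metadata = (MetadataImplementor) sources.buildMetadata();

// Hibernate 5.2 style: parameterless constructor, metadata passed to create()
new SchemaExport().create( EnumSet.of( TargetType.DATABASE ), metadata );

// the registry is not closed automatically, so release it when done
StandardServiceRegistryBuilder.destroy( serviceRegistry );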

answered Oct 20 '22 by Vlad Mihalcea


Distilling from the other answers, I had to do the following:

StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
        .applySetting("hibernate.hbm2ddl.auto", "create")
        .applySetting("hibernate.dialect", "org.hibernate.dialect.MySQLDialect")
        .applySetting("hibernate.id.new_generator_mappings", "true")
        .build();
MetadataSources sources = new MetadataSources(standardRegistry);
managedClassNames.forEach(sources::addAnnotatedClass);
MetadataImplementor metadata = (MetadataImplementor) sources
        .getMetadataBuilder()
        .build();

SchemaExport export = new SchemaExport(metadata);
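
Note that SchemaExport(metadata) relies on the constructor that was removed in 5.2 (see the edit in the question). On 5.2 the same metadata would presumably be handed to the parameterless SchemaExport instead:

// Hibernate 5.2 variant of the last line, reusing the metadata built above
new SchemaExport().create(EnumSet.of(TargetType.DATABASE), metadata);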

Hopefully that helps.

answered Oct 20 '22 by mmastika


I like the distilled answer already posted. But here's a bit more detail that uses a few of the new built-in Hibernate enums, which let you create database tables with the SchemaExport class more programmatically, without relying on property-file-style settings like hbm2ddl.

Use a HashMap for properties

Normally I like to put all the database settings into a Properties-style object such as a Hashtable or HashMap. This map is then passed to the StandardServiceRegistryBuilder.

Map<String, String> settings = new HashMap<>();
settings.put("connection.driver_class", "com.mysql.jdbc.Driver");
settings.put("dialect", "org.hibernate.dialect.MySQLDialect");
settings.put("hibernate.connection.url", "jdbc:mysql://localhost/hibernate_examples");
settings.put("hibernate.connection.username", "root");
settings.put("hibernate.connection.password", "password");
ServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder().applySettings(settings).build();

Add JPA entities to the MetadataSources

Then add all your JPA-annotated classes to the MetadataSources object:

MetadataSources metadata = new MetadataSources(serviceRegistry);
metadata.addAnnotatedClass(Player.class);

Use Action and TargetType Enums

After all that's done, it's time to create the SchemaExport instance and call its execute method. In doing so, you can use the TargetType.DATABASE and Action.BOTH enums instead of putting an hbm2ddl entry into the settings, such as:

applySetting("hibernate.hbm2ddl.auto", "create")

Here's how it looks:

EnumSet<TargetType> enumSet = EnumSet.of(TargetType.DATABASE);
SchemaExport schemaExport = new SchemaExport();
schemaExport.execute(enumSet, Action.BOTH, metadata.buildMetadata());

And sorry if this gets long, but here's the whole version 5 Hibernate SchemaExport code together in one go:

Map<String, String> settings = new HashMap<>();
settings.put("connection.driver_class", "com.mysql.jdbc.Driver");
settings.put("dialect", "org.hibernate.dialect.MySQLDialect");
settings.put("hibernate.connection.url", "jdbc:mysql://localhost/jpa");
settings.put("hibernate.connection.username", "root");
settings.put("hibernate.connection.password", "password");
settings.put("hibernate.show_sql", "true");
settings.put("hibernate.format_sql", "true");

ServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder().applySettings(settings).build();

MetadataSources metadata = new MetadataSources(serviceRegistry);
//metadata.addAnnotatedClass(Player.class);

EnumSet<TargetType> enumSet = EnumSet.of(TargetType.DATABASE);
SchemaExport schemaExport = new SchemaExport();
schemaExport.execute(enumSet, Action.BOTH, metadata.buildMetadata());
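
One detail worth adding: the ServiceRegistry built above is not closed automatically, so it is worth destroying it once the export has run, for example:

// release the registry (and its JDBC connection provider) after the export
StandardServiceRegistryBuilder.destroy(serviceRegistry);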

The source code is available on GitHub.

answered Oct 20 '22 by Cameron McKenzie