How to resolve a java.lang.OutOfMemoryError when Eclipse Memory Analyzer reports "java.lang.String", loaded by "<system class loader>"

I am reading some large XML files and storing them into a database. Each file is around 800 MB.

It stores many records, then terminates with an exception:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.IdentityHashMap.resize(Unknown Source)
    at java.util.IdentityHashMap.put(Unknown Source)

Using Eclipse Memory Analyzer I have created a .hprof file, which says:

  76,581 instances of "java.lang.String", loaded by "<system class loader>" occupy 1,04,34,45,504 (98.76%) bytes. 

Keywords
java.lang.String

I have setters and getters for retrieving values. How do I resolve this issue? Any help would be appreciated.


I have already tried increasing the memory through the JRE .ini file, but the problem isn't solved.

EDIT: I am using scireumOpen to read XML files.

Example code I have used:

public void readD() throws Exception {

    XMLReader reader = new XMLReader();

    reader.addHandler("node", new NodeHandler() {

        @Override
        public void process(StructuredNode node) {
            try {
                obj.setName(node.queryString("name"));
                save(obj);
            } catch (XPathExpressionException xPathExpressionException) {
                xPathExpressionException.printStackTrace();
            } catch (Exception exception) {
                exception.printStackTrace();
            }
        }
    });

    reader.parse(new FileInputStream("C:/Users/some_file.xml"));
}

public void save(Reader obj) {
    try {
        EntityTransaction entityTransaction = em.getTransaction();
        entityTransaction.begin();
        Entity e1 = new Entity();
        e1.setName(obj.getName());

        em.persist(e1);
        entityTransaction.commit();
    } catch (Exception exception) {
        exception.printStackTrace();
    }
}
Shiv asked Jan 25 '26 04:01


2 Answers

Try using another parser for XML processing.

Processing one big 800 MB XML file with a tree-based parser such as DOM is not feasible, as it loads the whole document into memory.

Use SAX or StAX in Java instead, and process the parsing results immediately, without trying to load the complete XML file into memory.

Also, don't keep all of the parsing results in memory. Write them to the database as fast as possible, and keep the scope of each parsing result as narrow as possible.

Perhaps also use intermediate tables in the database and do the processing on the full datasets inside the database.
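As a sketch of the streaming approach, using the JDK's built-in StAX API (the `node`/`name` element names mirror the question's structure, and the small inline document is a stand-in for the real file):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxNames {

    // Streams the document and handles each <name> value as it is seen.
    // In the real program you would save each value to the database here
    // instead of collecting it, so at most one record is held in memory.
    static List<String> extractNames(java.io.Reader source) throws Exception {
        List<String> names = new ArrayList<>();
        XMLStreamReader reader =
                XMLInputFactory.newInstance().createXMLStreamReader(source);
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "name".equals(reader.getLocalName())) {
                // Reads the element's text and advances past </name>
                names.add(reader.getElementText());
            }
        }
        reader.close();
        return names;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<root><node><name>a</name></node>"
                   + "<node><name>b</name></node></root>";
        System.out.println(extractNames(new StringReader(xml)));
    }
}
```

For the 800 MB file you would pass an `InputStreamReader` over a `FileInputStream` instead of the `StringReader`; StAX pulls events from the stream, so memory use stays flat regardless of file size.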

Uwe Plonus answered Jan 26 '26 16:01


Your heap is too limited to hold such a big XML file in memory. Try increasing the heap size using the -Xmx JRE option.
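For example (the jar name and the 2 GB size are placeholders; tune the value to your machine):

```shell
# Start the JVM with a 2 GB maximum heap
java -Xmx2g -jar xml-importer.jar
```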

or

try to use http://vtd-xml.sourceforge.net/ for faster and lighter XML processing.

Juned Ahsan answered Jan 26 '26 16:01