Crashes with too many records

Issue #30 resolved
Murray Bryant created an issue

When I try to run an import with a large number of records I get the following message. Is there a configuration change needed to allow large imports?

I am running this on OS X.

Exception in thread "Thread-9" java.lang.IllegalStateException: Attempt to allocate #17 extra segment tier, 16 is maximum.
Possible reasons include:
 - you have forgotten to configure (or configured wrong) builder.entries() number
 - same regarding other sizing Chronicle Hash configurations, most likely maxBloatFactor(), averageKeySize(), or averageValueSize()
 - keys, inserted into the ChronicleHash, are distributed suspiciously bad. This might be a DOS attack
    at net.openhft.chronicle.hash.impl.VanillaChronicleHash.allocateTier(VanillaChronicleHash.java:767)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.nextTier(CompiledMapQueryContext.java:2858)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.alloc(CompiledMapQueryContext.java:3022)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.initEntryAndKey(CompiledMapQueryContext.java:3436)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.putEntry(CompiledMapQueryContext.java:3891)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.doInsert(CompiledMapQueryContext.java:4080)
    at net.openhft.chronicle.map.MapEntryOperations.insert(MapEntryOperations.java:157)
    at net.openhft.chronicle.map.impl.CompiledMapQueryContext.insert(CompiledMapQueryContext.java:4051)
    at net.openhft.chronicle.map.MapMethods.putIfAbsent(MapMethods.java:121)
    at net.openhft.chronicle.map.VanillaChronicleMap.putIfAbsent(VanillaChronicleMap.java:517)
    at net.openhft.chronicle.set.SetFromMap.add(SetFromMap.java:74)
    at com.graphaware.neo4j.databridge.catalogs.edge.EdgeCatalog.add(EdgeCatalog.java:86)
    at com.graphaware.neo4j.databridge.catalogs.edge.EdgeCatalog.register(EdgeCatalog.java:78)
    at com.graphaware.neo4j.databridge.DatabridgeWorker.versionEdge(DatabridgeWorker.java:221)
    at com.graphaware.neo4j.databridge.DatabridgeWorker.createEdge(DatabridgeWorker.java:203)
    at com.graphaware.neo4j.databridge.DatabridgeWorker.createEdges(DatabridgeWorker.java:176)
    at com.graphaware.neo4j.databridge.DatabridgeWorker.createGraphObjects(DatabridgeWorker.java:70)
    at com.graphaware.neo4j.databridge.DatabridgeWorker.run(DatabridgeWorker.java:82)
    at java.lang.Thread.run(Thread.java:745)
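
For reference, the sizing options the exception names (entries(), maxBloatFactor(), averageKeySize(), averageValueSize()) are set when the Chronicle map or set is built. The sketch below shows that kind of configuration against the standard Chronicle Map 3 builder API; the class name and the sizes are purely illustrative and are not Databridge's actual code:

    import net.openhft.chronicle.set.ChronicleSet;
    import net.openhft.chronicle.set.ChronicleSetBuilder;

    public class SizingSketch {
        public static void main(String[] args) {
            // Illustrative only: a ChronicleSet sized for an expected number of entries.
            // If far more keys than entries() (times maxBloatFactor()) are inserted,
            // tier allocation can eventually fail with the IllegalStateException above.
            try (ChronicleSet<CharSequence> edgeKeys = ChronicleSetBuilder
                    .of(CharSequence.class)
                    .averageKeySize(64)      // assumed average key length in bytes
                    .entries(1_000_000)      // expected number of entries
                    .maxBloatFactor(2.0)     // how far beyond entries() the set may grow
                    .create()) {
                edgeKeys.add("example-key");
            }
        }
    }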

Comments (6)

  1. Murray Bryant reporter

    When I restrict the dataset to 2000 records, the import runs through with no errors.

  2. Vince Bickers repo owner

    Hi Murray,

    This issue is related to the other one you raised, #32.

    I can reproduce it using the demo datasets, e.g.:

    bin/databridge import -l 2 demo/satellites
    ...
    bin/databridge import demo/satellites
    

    The workaround is to use the -d flag on the second and subsequent imports.
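
    For the example above, that would look something like this (assuming the -d flag is simply added to the import command, alongside the same arguments as before):

    bin/databridge import -l 2 demo/satellites
    ...
    bin/databridge import -d demo/satellites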

    On balance I'm inclined to regard this as a bug: using the -l flag to limit the data input indicates that you're still testing, so I'll look into fixing this.

    In the meantime, please let me know if the workaround is OK for you.

    Regards, Vince

  3. Murray Bryant reporter

    Hi Vince

    I can replicate this issue on the first test run of the dataset, after deleting the existing data directory.

    Do I need to delete some data somewhere else?

    Regards,

    Murray
