Discussion Closed: This discussion was created more than 6 months ago and has been closed.

outofmemory


Hello,
I have a contact model. When the mesh is made very fine, it cannot solve and reports an error:


Exception:
com.femlab.jni.FlNativeException: Out of memory during assembly
Messages:
Out of memory during assembly

Stack trace:
at xmodel.cpp, row 2320, ()
at com.femlab.solver.FlSolver.femStatic(Native Method)
at com.femlab.solver.FemStatic.run(Unknown Source)
at com.femlab.server.FlRunner.run(Unknown Source)
at com.femlab.util.i.run(Unknown Source)
at com.femlab.util.aa.run(Unknown Source)


I need your advice to deal with this problem.
Thank you

11 Replies Last Post Mar 21, 2010, 5:34 a.m. EDT


Posted: Nov 16, 2009, 5:45 a.m. EST
I believe this problem is very similar to one I encountered when solving very finely meshed problems (COMSOL said: out of memory during LU factorization). The operation's demand for memory simply exceeds your RAM. I can't say why it is not possible to use swap or temp files for data storage, but you may be able to solve it by making the mesh less fine (provided the model remains reliable, of course).


Posted: Nov 17, 2009, 12:24 p.m. EST
Hi Hoa,

I have also had some experience with this problem. Try the following:
- Make the mesh less fine in the unimportant sub-domains, or in all sub-domains.
- Use geometric symmetry to model only a small part of the device or machine; this reduces the required memory significantly and the program will run faster.

Good luck!

Hung Vu Xuan


Posted: Nov 18, 2009, 8:52 a.m. EST
Hi Hung,
Thank you all. I'll try it.


Posted: Dec 13, 2009, 6:38 p.m. EST
On Page 124 of v3.5a User's Guide there is a description of how to increase the default Java heap space by changing the MAXHEAP variable in the file comsol.opts, which is loaded on launch. The default value that came with v3.5 was 256 MB. Try increasing this, taking into account how much memory you have in your machine. Let us know how this works.
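For reference, the setting described above lives in comsol.opts. The exact path and shipped value vary by version and platform, so treat the following as an assumed sketch rather than the literal file contents:

```
# comsol.opts -- read when COMSOL launches (typically found in the
# installation's bin directory; exact location varies by version).
# MAXHEAP sets the Java VM's maximum heap size. The 1024m value below
# is an example; the v3.5 default mentioned above was 256 MB.
MAXHEAP=1024m
```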


Posted: Dec 13, 2009, 6:43 p.m. EST
Another approach might be to experiment with different linear solvers. Some of them require a contiguous heap, some don't, and at least one of them reportedly does the calculation out of core. This presumably means the solver breaks the problem up into pieces and swaps parts in and out from disk as needed. It would be slow, but from my reading it may be the only way to handle really large problems.
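The out-of-core idea can be sketched in a few lines of Python. This is an illustration of the general technique only, not COMSOL's actual implementation: the matrix stays on disk, and only one block of rows is resident in RAM at a time.

```python
import numpy as np
import os
import tempfile

def out_of_core_matvec(path, shape, x, block_rows=64):
    """Compute A @ x where A lives on disk as a float64 memmap.

    Only `block_rows` rows of A are pulled into memory at once,
    so peak RAM stays small no matter how large A is on disk.
    """
    n_rows, n_cols = shape
    A = np.memmap(path, dtype=np.float64, mode="r", shape=shape)
    y = np.empty(n_rows)
    for start in range(0, n_rows, block_rows):
        stop = min(start + block_rows, n_rows)
        y[start:stop] = A[start:stop] @ x  # one block in RAM at a time
    return y

# Build a small on-disk matrix to demonstrate the idea.
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".dat")
tmp.close()
shape = (500, 200)
rng = np.random.default_rng(0)
A_full = rng.standard_normal(shape)
A_disk = np.memmap(tmp.name, dtype=np.float64, mode="w+", shape=shape)
A_disk[:] = A_full
A_disk.flush()

x = rng.standard_normal(shape[1])
y = out_of_core_matvec(tmp.name, shape, x)
assert np.allclose(y, A_full @ x)  # matches the in-memory product
os.unlink(tmp.name)
```

Real out-of-core solvers (such as PARDISO's out-of-core mode mentioned later in this thread) apply the same principle to the factorization itself, which is why they trade speed for a much smaller RAM footprint.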


Posted: Mar 19, 2010, 10:12 a.m. EDT
Hi - since I am running into the same problem of too little memory, I went looking for answers and found your comment. I don't think increasing Java's heap space solves the problem, because as far as I know the solvers don't use the heap. Therefore, it might even be better to decrease the size of the heap to free more RAM for the solver. Any experience with that?
PARDISO out-of-core also ran out of memory in one of my problems, and I cannot make the mesh coarser or the solution is no longer correct. I keep trying other solvers...


Posted: Mar 19, 2010, 12:27 p.m. EDT
Hi,

MAXHEAP will not solve the problem; it is not a magic parameter. MAXHEAP is itself limited by the available memory and has nothing to do with the solver.
The real limit on your memory is physical: the hardware you have.
The only things you can do are to play with the mesh and to switch between direct and iterative solvers.

The ultimate solution is to ask for, or buy, more memory.

Good luck


Posted: Mar 19, 2010, 1:57 p.m. EDT
I have received a similar error with very few, coarse mesh elements on a very simple geometry. I have 3 GB of available memory and a >2 GHz dual-core CPU! Isn't that odd? I used to run other commercial programs with much larger meshes and more complicated geometries on a weaker machine. So I think it may not necessarily be only a memory problem.

Jim Freels, mechanical side of nuclear engineering, multiphysics analysis, COMSOL specialist


Posted: Mar 20, 2010, 10:58 p.m. EDT
I would suggest first using a coarser mesh if possible. If that is not possible, get a 64-bit computer with more memory. You can also try the segregated solver and iterative solvers to reduce memory requirements; direct solvers use the most memory.
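The memory gap between direct and iterative solvers is easy to see outside COMSOL. Below is a hedged SciPy/NumPy sketch (the test matrix and solver choices are mine, not from the thread): a direct solver factorizes the matrix and must store the fill-in, while a conjugate-gradient loop touches the matrix only through matrix-vector products, so its memory stays at a handful of vectors.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 2D Poisson matrix on an n x n grid (symmetric positive definite).
n = 50
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
I = sp.identity(n)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(A.shape[0])

# Direct solve: the LU factors carry fill-in beyond A's own nonzeros,
# and that fill-in is where the extra memory goes.
lu = spla.splu(A)
x_direct = lu.solve(b)
fill_ratio = (lu.L.nnz + lu.U.nnz) / A.nnz

# Iterative solve (hand-written conjugate gradient): A is only ever
# applied to a vector, so no factorization is stored at all.
def conjugate_gradient(A, b, tol=1e-10, maxiter=20000):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_iter = conjugate_gradient(A, b)
assert np.allclose(x_direct, x_iter, atol=1e-5)
assert fill_ratio > 1.0  # the factors really are bigger than A itself
```

The trade-off is the usual one: the iterative solver needs a well-conditioned (or well-preconditioned) problem to converge, which is why the posts above suggest trying both kinds.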


Posted: Mar 21, 2010, 5:23 a.m. EDT
Since posting my comment above I bought a new quad-core machine with 24 GB of memory, running 64-bit Windows 7. The old out-of-memory problems disappeared, but it's easy to make them reappear with a fine enough mesh: hit refine-mesh enough times and eventually you run out of memory. For me that now happens at 38M nodes. Even at "only" 8M nodes, the solvers ramp memory usage all the way up to 24 GB, and then swapping starts.

However, there is now another problem, not directly related to solving a particular model but to rendering it on screen. For mesh sizes that are too large you can get another, different kind of out-of-memory problem that reflects the size of your VRAM. This can be worked around by turning off automatic rendering, running your problem, and then reducing the size of the mesh used for displaying the solution.


Posted: Mar 21, 2010, 5:34 a.m. EDT
About a slightly different set of limitations: I wanted to know where a simulation was spending most of its time, so I exported it to an M-file and ran it from MATLAB with the profiler turned on. It turned out it was spending most of its time in the Java garbage collector. So one wonders whether changing the heap size might at least affect the computation time.

Note that while COMSOL employees may participate in the discussion forum, COMSOL® software users who are on-subscription should submit their questions via the Support Center for a more comprehensive response from the Technical Support team.