-
Yes, the problem is due to the boundary-condition implementation in combination with the periodic BC. The Neumann pressure and no-slip boundary conditions require an extrapolation step just before the acceleration computation. Ideally, one should copy the extrapolated properties to the corresponding periodic (ghost) particles, since those still carry the pressure from the last update only. This does not affect the values over short runs, but the error accumulates over long ones. A computationally fast fix is to write an equation that copies the appropriate properties to the corresponding periodic particle.
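Below is a minimal sketch of such a copy equation. It follows the pattern PySPH's GTVF implementation uses for its own ghost updates, and it assumes that periodic ghosts carry tag 2 and that `gid` holds the index of the source particle; the class name and the choice of copying only the pressure `p` are illustrative, not part of any particular scheme:

```python
from pysph.sph.equation import Equation
from compyle.api import declare  # declare() is resolved by PySPH's transpiler


class CopyExtrapolatedPropsToGhosts(Equation):  # hypothetical name
    """Copy freshly extrapolated properties from each real particle to
    its periodic ghost, so the acceleration computation that follows
    sees current values instead of those from the last update."""
    def initialize(self, d_idx, d_tag, d_gid, d_p):
        idx = declare('int')
        # Tag 2 marks ghost particles; gid points back at the source.
        if d_tag[d_idx] == 2:
            idx = d_gid[d_idx]
            d_p[d_idx] = d_p[idx]
```

Put it in a group that runs after the wall extrapolation and before the acceleration group, and extend the argument list with whatever other extrapolated properties your boundary condition updates.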
-
Hi. Your time step dt is 3e-2, and the spacing dx, I assume, is around 1e-2. Even if we assume a second-order accurate time stepping and a second-order accurate space discretization, the per-step error will be of order dt^2 * dx^2 = 9e-8. After 5.3e7 iterations the accumulated error will then be of order 4.8, which is huge. So after that many iterations you should not expect the error to stay small. These are hand-wavy calculations, but I hope you get the point: if you want to run a lot of iterations, your time integrator and space discretization must be highly accurate.
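Spelling the accumulation estimate out (same numbers as above, nothing new assumed):

$$
e_{\mathrm{total}} \approx N \, e_{\mathrm{step}} \approx 5.3\times 10^{7} \times \Delta t^{2}\,\Delta x^{2} = 5.3\times 10^{7} \times 9\times 10^{-8} \approx 4.8
$$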
-
It provides the ghost particles with the updated values from the previous group; that is why it performs nnps_update().
Yes.
Sometimes you may have particle shifting or other calculations before the actual self.compute_accelerations() in do_post_stage(). The nnps_update() is there to ensure the ghosts are up to date; you can customize it if you want and make it faster.
You can find it in the original reply.
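To make the group/ghost-update mechanism above concrete, here is a minimal sketch: as I understand it, setting `update_nnps=True` on a `Group` makes PySPH refresh the neighbor information (and with it the periodic ghosts) between that group and the next. The specific equations are placeholders, chosen only because they exist in `pysph.sph.basic_equations`:

```python
from pysph.sph.equation import Group
from pysph.sph.basic_equations import SummationDensity, ContinuityEquation

equations = [
    Group(
        equations=[SummationDensity(dest='fluid',
                                    sources=['fluid', 'solid'])],
        # Refresh NNPS/ghosts so the next group sees the values
        # computed in this group.
        update_nnps=True,
    ),
    Group(
        equations=[ContinuityEquation(dest='fluid',
                                      sources=['fluid', 'solid'])],
    ),
]
```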
-
Dear PySPH community,
I was trying to perform some simulations but kept running into instabilities. The simulations I tried were relatively long, and the problem seems to show up only after a certain amount of simulation time has elapsed. To test this hypothesis, I ran a preinstalled example (couette.py) that uses the same scheme (TVFScheme) as my own simulations for a very long simulation time, to see whether these instabilities would also show up there. The only thing I changed compared to the original example was --tf 1920000.0 instead of --tf 100.0.
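For reference, the full invocation (I ran the example as a script; `pysph run couette` with the same flag should be equivalent if you use the example runner):

```
python couette.py --tf 1920000.0
```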
The results indeed showed the instability I had encountered in my own simulations, so I know I didn't make a mistake somewhere. Nevertheless, I'm really wondering what this instability might be and how it is caused. At first it seemed to be a tensile instability, but tensile instabilities should be suppressed in the TVF scheme. I also tried increasing the background pressure of the TVF scheme to make sure the effect is strong enough, but this didn't help, so I suppose it is not a tensile instability. Furthermore, I tested the scheme with an angular-momentum-conserving viscous term in the momentum equation instead of the one used by Adami et al., but unfortunately this didn't help either.
I noticed that the instabilities start near the corners of the periodic domain, so might it have something to do with the implementation of periodic boundary conditions in PySPH, or with the solid-wall boundary conditions as prescribed by Adami et al.? I would like to hear your views on this and look forward to a discussion.
You can find the full output, saved every 800 iterations, here: couette_output.zip. To illustrate the issue, I have attached two screenshots of the output at different times (one while the flow is still stable and one where particles are flying around vigorously).