
[Feature Request] Add initialization option with constant particle weighting #3213

Open
BrianMarre opened this issue Mar 17, 2020 · 3 comments

@BrianMarre (Member)

Short version: Implementing this would be the first-order solution I require for collisional excitation/ionization in my topic-atomicPhysics pull request.

Longer Version:
I am currently implementing the change of atomic state due to ion-electron collisions at the supercell level.
The basic idea is to change the single atomic state stored in an ion macroparticle due to collisions with the electron velocity distribution.
If the velocity distribution is not isotropic, the collision rate R of an ion depends on the relative velocity between this ion and electrons of a given energy E:

R_col = |v_rel(E_electron)| * sigma(E_electron) * n_e(E_electron)

where sigma is the collision cross section and n_e the electron number density contributed by electrons of energy E_electron.

For ion acceleration, neither the electron nor the ion velocity distribution is isotropic, and I would therefore need the actual relative velocity to calculate the collision rate.
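To make the rate evaluation concrete, here is a minimal C++ sketch of the formula above; everything in it (`ElectronBin`, `crossSection`, all names) is illustrative and not PIConGPU API:

```cpp
#include <cmath>

// Illustrative sketch, not PIConGPU API: evaluate
//   R_col = |v_rel(E)| * sigma(E) * n_e(E)
// for one ion and one electron energy bin. crossSection() is a
// hypothetical stand-in for the atomic data lookup.
struct ElectronBin
{
    double energy;      // central electron energy of the bin
    double density;     // electron number density contributed by the bin
    double velocity[3]; // mean electron velocity of the bin
};

double crossSection(double /*electronEnergy*/)
{
    return 1.0e-20; // dummy constant placeholder for sigma(E)
}

double collisionRate(double const (&ionVelocity)[3], ElectronBin const& bin)
{
    // relative speed |v_ion - v_e| between the ion and this bin's electrons
    double relSq = 0.0;
    for (int d = 0; d < 3; ++d)
    {
        double const dv = ionVelocity[d] - bin.velocity[d];
        relSq += dv * dv;
    }
    return std::sqrt(relSq) * crossSection(bin.energy) * bin.density;
}
```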

The naive approach would be to use a 3-dimensional electron velocity histogram, calculate the relative velocity of a given bin with respect to a given ion macroparticle's velocity, and repeat this for all ion macroparticles.
Unfortunately I need a good enough energy resolution, e.g. something like 256 bins per dimension, which would result in 256^3 different bins. The resulting histogram would be too large to fit into shared memory, and the simulation would have too few macroparticles (only approx. 5150) to actually sample the distribution well enough.
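To put numbers on the shared-memory problem, a quick back-of-the-envelope check (the 4-byte counters are my assumption):

```cpp
#include <cstdio>

int main()
{
    // Naive dense 3D velocity histogram, 256 bins per dimension.
    constexpr long long bins = 256LL * 256LL * 256LL; // 16'777'216 bins
    constexpr long long bytes = bins * 4LL;           // assuming 4-byte counters
    std::printf("%lld bins -> %.0f MiB\n", bins, bytes / (1024.0 * 1024.0));
    // prints "16777216 bins -> 64 MiB" -- far beyond the tens of KiB of
    // shared memory available per GPU multiprocessor, and far too many
    // bins for ~5150 macroparticles to populate.
    return 0;
}
```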

I want to avoid this by using a random binary pairing of electron and ion macroparticles.
To conserve energy, the kinetic energy of the colliding electron must be reduced or increased by the energy consumed/released by the atomic state change, if a collision happens.
If both particles have the same weighting, this is simply a matter of scaling the electron macroparticle's velocity. If the weightings are not equal, I could either

  • scale the velocity as if the weightings were equal, which can break energy conservation, since the energy taken from the electron macroparticle is then higher or lower than the atomic state transition energy, or
  • scale the macroparticle's velocity differently depending on the weighting ratio of ion and electron macroparticle, which sacrifices the correct change of velocity, and thereby the PIC dynamics, by averaging over two different electron populations.
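For the equal-weighting case, the scaling I have in mind looks roughly like the following non-relativistic sketch (names are illustrative):

```cpp
#include <cmath>

// Sketch for equal weightings, non-relativistic, illustrative names only:
// remove the atomic transition energy deltaE (per real particle, > 0 for
// excitation) from the colliding electron macroparticle by rescaling its
// speed. With equal weightings this conserves total energy exactly.
// Returns false if the electron cannot pay for the transition.
bool scaleElectronVelocity(double (&velocity)[3], double mass, double deltaE)
{
    double vSq = 0.0;
    for (int d = 0; d < 3; ++d)
        vSq += velocity[d] * velocity[d];

    double const eKin = 0.5 * mass * vSq; // kinetic energy before the collision
    double const eNew = eKin - deltaE;    // kinetic energy after the state change
    if (eNew < 0.0)
        return false; // collision is energetically forbidden

    double const factor = std::sqrt(eNew / eKin); // |v_new| / |v_old|
    for (int d = 0; d < 3; ++d)
        velocity[d] *= factor;
    return true;
}
```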

Other possible solutions might be:

  • split paired macroparticles such that the surplus weighting is transferred to a third particle (see the sketch after this list), or
  • sum all changes of energy for electrons of a given energy and apply them, distributed evenly by weighting, to all electron macroparticles in the respective supercell.
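For the first option, the weighting bookkeeping could look roughly like this; a real implementation would have to create the third particle through PIConGPU's particle creation machinery, which is not shown:

```cpp
#include <algorithm>

// Sketch of the weight-splitting idea, illustrative types only: before
// colliding an unequally weighted (ion, electron) pair, split the heavier
// partner so that the colliding pair has a common weighting and the
// surplus is carried by a new third particle.
struct Particle
{
    double weighting;
    double velocity[3];
};

Particle splitSurplus(Particle& ion, Particle& electron)
{
    double const common = std::min(ion.weighting, electron.weighting);
    Particle& heavier = (ion.weighting > electron.weighting) ? ion : electron;

    Particle surplus = heavier; // copies velocity and other attributes
    surplus.weighting = heavier.weighting - common;
    heavier.weighting = common; // the pair is now equally weighted
    return surplus;             // caller must insert this third particle
}
```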
@BrianMarre (Member, Author)

@n01r and @ax3l, do you have any further ideas how this might be solved?
@psychocoderHPC, are there any further considerations I should be aware of?

@n01r (Member) commented Mar 17, 2020

First offline discussion result: start with the implementation for an isotropic velocity distribution, since the ion velocity will at first be negligible compared to the electron velocity. That is the widely accepted method against which you can compare later, when you move on from there to the next, more complex treatment.

@n01r added the question, component: core, and component: user input labels on Mar 18, 2020
@HighIander (Member)

A solution to the problem of "the simulation having too few macroparticles to actually sample the distribution enough, only approx. 5150 macroparticles" would be to use a dynamic lookup table from an index i to energy bin j, and to only keep those j that are sufficiently occupied. Your histogram then only runs over i, whose number is much smaller than the number of all possible j (256^3 in your example). In most situations that would also solve the problem of "the resulting histogram being too large to fit into shared memory", but one would have to be careful about those cases where it doesn't.
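As a host-side illustration of such a dynamic lookup table (the container choice and all names are only illustrative; a GPU version would need, e.g., a shared-memory hash map):

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Host-side sketch of the dynamic lookup table, illustrative only: the
// dense energy-bin index j (up to 256^3 values) is mapped to a compact
// index i that exists only for occupied bins, so the histogram stores
// one entry per occupied bin instead of one per possible bin.
struct SparseHistogram
{
    std::unordered_map<std::uint32_t, std::uint32_t> binToCompact; // j -> i
    std::vector<float> weight; // accumulated macroparticle weighting per i

    void add(std::uint32_t binJ, float particleWeighting)
    {
        auto const [it, inserted] = binToCompact.try_emplace(
            binJ, static_cast<std::uint32_t>(weight.size()));
        if (inserted)
            weight.push_back(0.f); // first particle seen in bin j: new slot i
        weight[it->second] += particleWeighting;
    }
    // Bins that stay insufficiently occupied could be pruned or merged
    // before the histogram is used, as suggested above.
};
```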
