Add option to use openPMD-api directly from AMReX I/O #489

Open

Wants to merge 5 commits into base: development
Conversation

guj commented on Dec 20, 2023

This pull request is to be merged after the AMReX openPMD-api I/O support is approved: AMReX-Codes/amrex#3666
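For context, a minimal sketch of the compile-time switch this PR appears to introduce: when AMReX is built with its openPMD-api backend, the AMREX_USE_OPENPMD_API guard (used in the hunks below) selects the AMReX-backed writer, and otherwise the existing openPMD diagnostics in this repository are used. The two writer functions are hypothetical placeholders, not APIs from this PR or from AMReX-Codes/amrex#3666.

#include <string>

// Hypothetical hooks, for illustration only; the real entry points live in
// AMReX-Codes/amrex#3666 and in the diagnostics/ changes of this PR.
void write_beam_via_amrex_openpmd (std::string const & series_name);
void write_beam_via_impactx_openpmd (std::string const & series_name);

void write_beam_diagnostics (std::string const & series_name)
{
#ifdef AMREX_USE_OPENPMD_API
    // new path: route beam output through AMReX's openPMD-api writer
    write_beam_via_amrex_openpmd(series_name);
#else
    // existing path: the in-repo openPMD diagnostics
    write_beam_via_impactx_openpmd(series_name);
#endif
}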

Comment on lines +195 to +204
/*
using SrcData = WarpXParticleContainer::ParticleTileType::ConstParticleTileDataType;
tmp.copyParticles(*pc,
[=] AMREX_GPU_HOST_DEVICE (const SrcData& src, int ip, const amrex::RandomEngine& engine)
{
const SuperParticleType& p = src.getSuperParticle(ip);
return random_filter(p, engine) * uniform_filter(p, engine)
* parser_filter(p, engine) * geometry_filter(p, engine);
}, true);
*/

Check notice (Code scanning / CodeQL), Commented-out code (Note): This comment appears to contain commented-out code.
Comment on lines +162 to +174
/* should be covered by amrex-openpmd-io
// SoA: Real
{
std::vector<std::string> real_soa_names(RealSoA::names_s.size());
std::copy(RealSoA::names_s.begin(), RealSoA::names_s.end(), real_soa_names.begin());
for (auto real_idx = 0; real_idx < RealSoA::nattribs; real_idx++) {
auto const component_name = real_soa_names.at(real_idx);
getComponentRecord(component_name).resetDataset(d_fl);
}
}
// SoA: Int
static_assert(IntSoA::nattribs == 0); // not yet used
*/

Check notice (Code scanning / CodeQL), Commented-out code (Note): This comment appears to contain commented-out code.
ax3l self-requested a review on Dec 22, 2023
ax3l self-assigned this on Dec 22, 2023
ax3l added the component: diagnostics (all types of outputs) label on Dec 22, 2023
@@ -73,6 +73,8 @@ macro(find_ablastr)

# shared libs, i.e. for Python bindings, need relocatable code
if(ImpactX_PYTHON OR BUILD_SHARED_LIBS)
set(AMReX_TINY_PROFILE ON CACHE BOOL "")
set(AMReX_OPENPMD_API ON CACHE BOOL "")
Member commented:
We have our own variable for this:

Suggested change:
- set(AMReX_OPENPMD_API ON CACHE BOOL "")
+ set(AMReX_OPENPMD ${ImpactX_OPENPMD} CACHE BOOL "" FORCE)
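(Forwarding ImpactX_OPENPMD here, rather than hard-coding the option to ON, presumably keeps the AMReX openPMD backend in sync with the project's own user-facing openPMD switch, so the extra dependency is only built when openPMD output is actually requested.)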

Comment on lines +271 to +277
} else if (element_type == "beam_plotplus") {
#ifdef AMREX_USE_OPENPMD_API
std::string openpmd_name = element_name;
pp_element.queryAdd("name", openpmd_name);
std::string openpmd_backend = "default";
pp_element.queryAdd("backend", openpmd_backend);
std::string openpmd_encoding{"g"};
Member commented:
Let's simply update beam_monitor and let us not add another element.
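A minimal sketch of that suggestion, assuming the existing "beam_monitor" branch reads the same name/backend/encoding options as the "beam_plotplus" hunk above and that diagnostics::BeamMonitor is constructed from (name, backend, encoding); pp_element, element_type, and element_name follow the hunk above, and m_lattice stands in for wherever the parsed element is stored:

} else if (element_type == "beam_monitor") {
    std::string openpmd_name = element_name;
    pp_element.queryAdd("name", openpmd_name);
    std::string openpmd_backend = "default";
    pp_element.queryAdd("backend", openpmd_backend);
    std::string openpmd_encoding{"g"};
    pp_element.queryAdd("encoding", openpmd_encoding);
    // With AMReX built against openPMD-api (AMREX_USE_OPENPMD_API), BeamMonitor
    // itself could switch to the AMReX-backed writer internally, so no second
    // element type is needed at the input-file level.
    m_lattice.emplace_back(
        diagnostics::BeamMonitor(openpmd_name, openpmd_backend, openpmd_encoding)
    );
}

This would keep the user-facing lattice input unchanged and confine the new I/O path to the implementation of BeamMonitor.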

@@ -36,6 +36,7 @@
#include "SoftQuad.H"
#include "ThinDipole.H"
#include "diagnostics/openPMD.H"
#include "diagnostics/plotplus.H"
Member commented:
Suggested change (delete this include):
- #include "diagnostics/plotplus.H"

Comment on lines +56 to +58
#ifdef AMREX_USE_OPENPMD_API
diagnostics::BeamPlotplus,
#endif
Member commented:
Let's update diagnostics::BeamMonitor instead

Suggested change (delete these lines):
- #ifdef AMREX_USE_OPENPMD_API
- diagnostics::BeamPlotplus,
- #endif

@@ -1,4 +1,5 @@
target_sources(lib
PRIVATE
openPMD.cpp
plotplus.cpp
Member commented:
Suggested change (delete this line):
- plotplus.cpp

Labels: component: diagnostics (all types of outputs)
Projects: None yet
Participants: 2