dsl: Move SubDimension thickness evaluation to the thicknesses themselves #2470

Merged · 20 commits · Nov 13, 2024
2 changes: 1 addition & 1 deletion devito/data/decomposition.py
@@ -320,7 +320,7 @@ def index_glb_to_loc(self, *args, rel=True):
             if self.loc_empty:
                 return None
             abs_ofs, side = args
-            if side is LEFT:
+            if side == LEFT:
                 rel_ofs = self.glb_min + abs_ofs - base
                 if abs_ofs >= base and abs_ofs <= top:
                     return rel_ofs
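
A general Python note relevant to this one-character change: `is` tests object identity while `==` tests equality, so an equal-but-rebuilt object can fail an identity check. A generic illustration (not Devito code; the class below is made up):

```python
class Side:
    """Toy stand-in for a side marker such as LEFT."""
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        return isinstance(other, Side) and self.name == other.name

    def __hash__(self):
        return hash(self.name)

LEFT = Side('left')
rebuilt = Side('left')

assert rebuilt == LEFT        # equality holds
assert rebuilt is not LEFT    # identity does not
```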
28 changes: 16 additions & 12 deletions devito/ir/equations/algorithms.py
@@ -8,7 +8,7 @@
                           ConditionalDimension)
 from devito.types.array import Array
 from devito.types.basic import AbstractFunction
-from devito.types.dimension import MultiSubDimension
+from devito.types.dimension import MultiSubDimension, Thickness
 from devito.data.allocators import DataReference
 from devito.logger import warning

@@ -213,18 +213,24 @@ def _(expr, mapper, rebuilt, sregistry):
     _concretize_subdims(expr.implicit_dims, mapper, rebuilt, sregistry)


+@_concretize_subdims.register(Thickness)
+def _(tkn, mapper, rebuilt, sregistry):
+    if tkn in mapper:
+        # Already have a substitution for this thickness
+        return
+
+    mapper[tkn] = tkn._rebuild(name=sregistry.make_name(prefix=tkn.name))
+
+
 @_concretize_subdims.register(SubDimension)
 def _(d, mapper, rebuilt, sregistry):
     if d in mapper:
         # Already have a substitution for this dimension
         return

-    tkns_subs = {tkn: tkn._rebuild(name=sregistry.make_name(prefix=tkn.name))
-                 for tkn in d.tkns}
-    left, right = [mM.subs(tkns_subs) for mM in (d.symbolic_min, d.symbolic_max)]
-    thickness = tuple((v, d._thickness_map[k]) for k, v in tkns_subs.items())
-
-    mapper[d] = d._rebuild(symbolic_min=left, symbolic_max=right, thickness=thickness)
+    tkns = tuple(t._rebuild(name=sregistry.make_name(prefix=t.name)) for t in d.tkns)
+    mapper.update({tkn0: tkn1 for tkn0, tkn1 in zip(d.tkns, tkns)})
+    mapper[d] = d._rebuild(thickness=tkns)


 @_concretize_subdims.register(ConditionalDimension)
@@ -261,11 +267,9 @@ def _(d, mapper, rebuilt, sregistry):
         # Already have a substitution for this dimension
         return

-    abstract_tkns = MultiSubDimension._symbolic_thickness(d.parent.name)
-    concrete_tkns = tuple(tkn._rebuild(name=sregistry.make_name(prefix=tkn.name))
-                          for tkn in abstract_tkns)
-
-    kwargs = {'thickness': concrete_tkns}
+    tkns = tuple(tkn._rebuild(name=sregistry.make_name(prefix=tkn.name))
+                 for tkn in d.thickness)
+    kwargs = {'thickness': tkns}
     fkwargs = {}

     idim0 = d.implicit_dimension
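
For context, a small end-to-end sketch (assuming the standard Devito public API; names are illustrative) of where these rebuilt thickness symbols end up: after lowering, a SubDimension's thicknesses appear among the Operator's parameters under registry-generated names.

```python
from devito import Grid, Function, Eq, Operator, SubDimension

grid = Grid(shape=(10,))
x, = grid.dimensions
xl = SubDimension.left(name='xl', parent=x, thickness=3)

f = Function(name='f', grid=grid)
op = Operator(Eq(f, 1).subs(x, xl))

# The thickness symbols of `xl` are rebuilt with unique, registry-generated
# names (the job of the handlers above) and show up as scalar parameters
print([p.name for p in op.parameters])
```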
2 changes: 1 addition & 1 deletion devito/ir/support/basic.py
@@ -1376,7 +1376,7 @@ def disjoint_test(e0, e1, d, it):
     if d.is_Custom:
         subs = {}
     elif d.is_Sub and d.is_left:
-        subs = {d.root.symbolic_min: 0, **dict([d.thickness.left])}
+        subs = {d.root.symbolic_min: 0, d.ltkn: d.ltkn.value}
     else:
         return False

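
This one-line change captures the PR's central move: a SubDimension no longer needs to expose its thicknesses as (symbol, value) pairs, because each thickness symbol now carries its own value. A small sketch of the accessors used in these diffs (assuming the standard Devito API):

```python
from devito import Grid, SubDimension

grid = Grid(shape=(10,))
x, = grid.dimensions
d = SubDimension.left(name='xl', parent=x, thickness=2)

# Previously the value was obtained by unpacking a (symbol, value) pair,
# e.g. `ltk, _ = d.thickness.left`; now the symbol itself carries it
ltkn, rtkn = d.tkns    # the left/right thickness symbols
print(d.ltkn.value)    # the left thickness value, here 2
```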
3 changes: 1 addition & 2 deletions devito/mpi/halo_scheme.py
@@ -89,8 +89,7 @@ def __init__(self, exprs, ispace):
                               if d._defines & self.dimensions])
         subdims = [d for d in dims if d.is_Sub and not d.local]
         for i in subdims:
-            ltk, _ = i.thickness.left
-            rtk, _ = i.thickness.right
+            ltk, rtk = i.tkns
             self._honored[i.root] = frozenset([(ltk, rtk)])
         self._honored = frozendict(self._honored)

7 changes: 7 additions & 0 deletions devito/operator/operator.py
@@ -31,6 +31,8 @@
                            split, timed_pass, timed_region, contains_val)
 from devito.types import (Buffer, Grid, Evaluable, host_layer, device_layer,
                           disk_layer)
+from devito.types.dimension import Thickness
+

 __all__ = ['Operator']

@@ -640,6 +642,11 @@ def _prepare_arguments(self, autotune=None, **kwargs):
         for d in reversed(toposort):
             args.update(d._arg_values(self._dspace[d], grid, **kwargs))

+        # Process Thicknesses
+        for p in self.parameters:

Contributor
I think this should be triggered from within SubDimension._arg_values itself.

Contributor Author
No, the issue is that the SubDimension isn't necessarily available to the Operator. You can craft legal Operators which only contain the thickness, not the parent SubDimension, hence the reworking in this PR.
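
For illustration, a minimal sketch of the kind of Operator being described, i.e. one that references a SubDimension's thickness without ever iterating over the SubDimension itself (this is not the MFE from the PR description; it assumes the standard Devito API and illustrative names):

```python
from devito import Grid, Function, Eq, Operator, SubDimension

grid = Grid(shape=(10,))
x, = grid.dimensions
xl = SubDimension.left(name='xl', parent=x, thickness=2)

f = Function(name='f', grid=grid)

# The RHS simplifies to xl's left-thickness symbol, so the Operator only
# sees the thickness as a free parameter; xl never appears as an
# iteration dimension
op = Operator(Eq(f, xl.symbolic_max - xl.symbolic_min + 1))
op.apply()  # the thickness value (here 2) should still be resolvable
```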

Contributor
Why isn't it caught by self.input's default at line 487?

Contributor Author
Assuming "it" means the subdomain: you can craft a legal Operator, as in the MFE at the top of this PR, where the Operator never sees the SubDimension, only its thickness symbols.

Contributor
I don't mean the subdomain, I mean the thickness. You're saying you can create an Operator where the Thickness (p here) is in op.parameters but is not in op.inputs?

If that's the case, i.e. if it is not in inputs, then it should not need to be in args, since it's not an input.
If it's not the case, then line 487 processes all inputs and should already cover this case.

Contributor Author
I tried it, and this is not the case.

Contributor
How is that possible? It's clearly in parameters, since you added this, so if you add is_Input and it doesn't catch it, then something is broken.

Contributor Author
There's no grid in the kwargs at the point where _arg_values is called on the inputs, which makes it a no-op. It also needs to be done before the Objects get _arg_values called on them, IIRC, hence the current position of this loop.

You could reorder _prepare_arguments to process the grid earlier, but my functions-on-subdomains branch has a bunch of reworking here and fixing the merge conflicts will be a pain, so best to stick a pin in it and fix it down the line, IMO.

Contributor
Can you open an issue about it then?

Contributor
I also don't understand this, honestly. We need an issue and, ideally, a comment on top of this loop explaining what's going on and what should actually happen instead. You can do it in another branch, as it's a minor thing.

+            if isinstance(p, Thickness):
+                args.update(p._arg_values(grid=grid, **kwargs))
+
         # Process Objects
         for o in self.objects:
             args.update(o._arg_values(grid=grid, **kwargs))
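
For reference, a minimal usage sketch (assuming the standard Devito API) of how thickness values enter the argument map at apply time: the caller never passes them explicitly; they are resolved from the SubDimension's definition during argument processing.

```python
from devito import Grid, Function, Eq, Operator, SubDimension

grid = Grid(shape=(10,))
x, = grid.dimensions
xl = SubDimension.left(name='xl', parent=x, thickness=3)

f = Function(name='f', grid=grid)
op = Operator(Eq(f, 1).subs(x, xl))

# No thickness is passed to apply(); it is picked up during argument
# processing, so only the first `thickness` points are updated
op.apply()
assert (f.data[:3] == 1).all()
assert (f.data[3:] == 0).all()
```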