Example:
Marginal log-probabilities that I calculated for 2 samples (3 categories):
tensor([[[-117.2179, -121.6661, -126.2901]],
        [[-163.1370, -166.3467, -173.6576]]])
The conditional probability function called without specifying target_vars, then indexed at the target variable (roughly as sketched after the tensor below):
tensor([[[9.8831e-01, 1.1574e-02, 1.1354e-04]],
        [[9.6119e-01, 3.8789e-02, 2.5930e-05]]])
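For context, this is roughly how I am calling the conditional query on my own circuit (a sketch only; `my_pc`, `my_data`, `my_missing_mask`, and the target-variable index `tv` are placeholders for my actual setup):

```python
import pyjuice as juice

tv = 0  # placeholder: index of the target variable in my circuit

# Without target_vars: query all variables, then index at the target variable.
# This is the call that produces the normalized probabilities above.
outputs_all = juice.queries.conditional(
    my_pc, data = my_data, missing_mask = my_missing_mask
)
print(outputs_all[:, tv, :])

# With target_vars: query only the target variable.
# This is the call whose outputs do not look normalized (tensor below).
outputs_target = juice.queries.conditional(
    my_pc, data = my_data, missing_mask = my_missing_mask, target_vars = [tv]
)
print(outputs_target)
```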
If I exponentiate and normalize the marginals, I get the same results as the conditional call above (a minimal version of that check is sketched after the tensor below). However, when I specify target_vars, the numbers do not appear to be normalized yet. And even after I normalize these values, they do not seem to give exactly the same probabilities as calling the conditional function and taking the proper index:
tensor([[[2.3663e-11, 1.2109e-13, 7.0562e-16]],
        [[1.2965e-01, 6.6344e-04, 3.8661e-06]]])
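And this is the normalization check I mean, using just the values pasted above (a minimal sketch; `marginal_lls` is the log-marginal tensor from the top of this issue, squeezed to two dimensions):

```python
import torch

# Log-marginals from above: 2 samples x 3 categories.
marginal_lls = torch.tensor([[-117.2179, -121.6661, -126.2901],
                             [-163.1370, -166.3467, -173.6576]])

# Exponentiate and normalize over the category dimension.
normalized = torch.softmax(marginal_lls, dim=-1)
print(normalized)
# tensor([[9.8831e-01, 1.1574e-02, 1.1354e-04],
#         [9.6119e-01, 3.8789e-02, 2.5930e-05]])
# This matches the conditional() output without target_vars, but the
# target_vars output above does not match, even after normalizing it.
```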
The example in the documentation does seem to work in both cases:
# Query only variable 1 directly
outputs = juice.queries.conditional(
    pc, data = data, missing_mask = missing_mask, target_vars = [1]
)
print(outputs)

# Query all variables, then index variable 1
outputs = juice.queries.conditional(
    pc, data = data, missing_mask = missing_mask
)
print(outputs[:, 1, :])
tensor([[[0.1053, 0.4940, 0.2592, 0.1415]],
        [[0.1350, 0.3746, 0.3253, 0.1650]]], device='cuda:0')
tensor([[0.1053, 0.4940, 0.2592, 0.1415],
        [0.1350, 0.3746, 0.3253, 0.1650]], device='cuda:0')
But the documentation example is quite small. Maybe there is some issue when scaling up, or something related to how my circuit is defined?