Making the kernel slightly more generic #57

Open · wants to merge 1 commit into base: main
qek/kernel/kernel.py (19 changes: 13 additions & 6 deletions)
@@ -84,8 +84,6 @@ def __call__(

         # Note: At this stage, size_max could theoretically still be `None`, if both `X1` and `X2`
         # are empty. In such cases, `dist_excitation` will never be called, so we're ok.

-        mu = float(self.params["mu"])
-
         feat_rows = [row.dist_excitation(size_max) for row in X1]

         if X2 is None:
@@ -100,8 +98,7 @@ def __call__(
         for i, dist_row in enumerate(feat_rows):
             for j in range(i, len(feat_rows)):
                 dist_col = feat_rows[j]
-                js = jensenshannon(dist_row, dist_col) ** 2
-                similarity = np.exp(-mu * js)
+                similarity = self.distance(dist_row, dist_col)
                 kernel[i, j] = similarity
                 if j != i:
                     kernel[j, i] = similarity
@@ -113,10 +110,20 @@ def __call__(
         feat_columns = [col.dist_excitation(size_max) for col in X2]
         for i, dist_row in enumerate(feat_rows):
             for j, dist_col in enumerate(feat_columns):
-                js = jensenshannon(dist_row, dist_col) ** 2
-                kernel[i, j] = np.exp(-mu * js)
+                kernel[i, j] = self.distance(dist_row, dist_col)
         return kernel

+    def distance(self, row: NDArray[np.floating], col: NDArray[np.floating]) -> np.floating:
+        """
+        The distance metric used to compute the kernel, used when calling `kernel(X1, X2)`.
+
+        To experiment with other distances, you may subclass `QuantumEvolutionKernel` and
+        override this method.
+        """
+        js = jensenshannon(row, col) ** 2
+        mu = float(self.params["mu"])
+        return np.exp(-mu * js)

     def similarity(self, graph_1: ProcessedData, graph_2: ProcessedData) -> float:
         """Compute the similarity between two graphs using Jensen-Shannon
         divergence.

Review comment (Collaborator), on the new `distance` method:

This seems fair. Consider renaming it to `similarity`/`divergence` to align more with ML frameworks.

Do we want to give the user the ability to modify this method? If so, maybe we should add an argument in the init, called `similarity_fn`, which will be used if provided; otherwise the default one that we have will be used.
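To illustrate the extension point this PR introduces, and the reviewer's `similarity_fn` suggestion, here is a minimal standalone sketch. `BaseKernel` is a stand-in for `QuantumEvolutionKernel` (not the real class, whose constructor may differ), the hand-rolled `jensenshannon` replaces the scipy import to keep the snippet self-contained, and the total-variation metric is just one illustrative alternative:

```python
import numpy as np


def jensenshannon(p: np.ndarray, q: np.ndarray) -> float:
    """Square root of the Jensen-Shannon divergence (scipy's convention)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a: np.ndarray, b: np.ndarray) -> float:
        mask = a > 0  # 0 * log(0) is taken as 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))


class BaseKernel:
    """Stand-in for QuantumEvolutionKernel: only the pieces relevant to this PR."""

    def __init__(self, mu: float):
        self.params = {"mu": mu}

    def distance(self, row: np.ndarray, col: np.ndarray) -> float:
        # Same default as the PR: exp(-mu * JS divergence).
        js = jensenshannon(row, col) ** 2
        return float(np.exp(-float(self.params["mu"]) * js))


class TotalVariationKernel(BaseKernel):
    """Subclassing route: override the hook with a different metric."""

    def distance(self, row: np.ndarray, col: np.ndarray) -> float:
        tv = 0.5 * float(np.abs(np.asarray(row) - np.asarray(col)).sum())
        return float(np.exp(-float(self.params["mu"]) * tv))


class PluggableKernel(BaseKernel):
    """Injection route (the reviewer's `similarity_fn` idea)."""

    def __init__(self, mu: float, similarity_fn=None):
        super().__init__(mu)
        self._similarity_fn = similarity_fn

    def distance(self, row: np.ndarray, col: np.ndarray) -> float:
        if self._similarity_fn is not None:
            return self._similarity_fn(row, col)
        return super().distance(row, col)


p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
base = BaseKernel(mu=0.5)
print(base.distance(p, p))  # identical distributions -> exp(0) = 1.0
print(base.distance(p, q) < 1.0)  # differing distributions -> strictly below 1
```

Either route keeps `__call__` untouched, since it only goes through the `distance` hook; the injection route has the advantage of not requiring a subclass per metric.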
qek/utils.py (9 changes: 9 additions & 0 deletions)

@@ -61,6 +61,15 @@ def inverse_one_hot(array: npt.ArrayLike, dim: int) -> np.ndarray:


 def make_sequence(device: Device, pulse: Pulse, register: Register) -> pulser.Sequence:
+    """
+    Build a sequence for a device from a pulse and a register.
+
+    This function is mostly intended for internal use and will likely move to qool-layer
+    in time.
+
+    Raises:
+        CompilationError if the pulse + register are not compatible with the device.
+    """
     try:
         sequence = pulser.Sequence(register=register, device=device)
         sequence.declare_channel("ising", "rydberg_global")

Review comment (Collaborator), on the new docstring:

Please add args in the docstring.
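A sketch of what the requested Args section might look like, in the Google docstring style the diff already uses. The parameter descriptions are assumptions, not taken from the qek documentation, and the type annotations and body are omitted here to keep the snippet self-contained:

```python
def make_sequence(device, pulse, register):
    """
    Build a sequence for a device from a pulse and a register.

    This function is mostly intended for internal use and will likely move to
    qool-layer in time.

    Args:
        device: The Pulser device on which the sequence will be executed.
        pulse: The pulse to add to the sequence's global Rydberg channel.
        register: The register of atoms to embed in the sequence.

    Returns:
        The compiled pulser.Sequence.

    Raises:
        CompilationError: If the pulse + register are not compatible with
            the device.
    """
    raise NotImplementedError  # body unchanged; see the diff above
```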