shapes (128418,3) and (0,124) not aligned: 3 (dim 1) != 0 (dim 0) #172

Open

ElifAyten opened this issue Feb 27, 2025 · 0 comments

@ElifAyten

Hello,

I am implementing an SLDS model using the ssm library and experimenting with different hyperparameters. My dataset contains both neuronal firing rates and external inputs (footshock, pupil size, speed), and I want both to influence the model, so I set the transition type to "recurrent".
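For reference, the arrays are laid out roughly as follows (a minimal sketch; the time-bin and neuron counts are inferred from the error message below, and the dummy arrays are only for illustration):

```python
import numpy as np

# Dummy arrays just to illustrate the layout (shapes inferred from the error below)
T = 128418                                    # number of time bins
firing_rates_normalized = np.zeros((T, 124))  # (time, neurons)
external_inputs = np.zeros((T, 3))            # (time, [footshock, pupil size, speed])
```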

However, when I include external_inputs and use transitions="recurrent", I encounter the following shape mismatch error:
```
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
in <cell line: 0>()
8 )
9
---> 10 elbos_recurrent, _ = rslds_recurrent_inputs.fit(
11 [firing_rates_normalized],
12 inputs=[external_inputs], # footshock, pupil, speed

9 frames
/usr/local/lib/python3.11/dist-packages/ssm/util.py in wrapper(self, datas, inputs, masks, tags, **kwargs)
109 tags = [tags]
110
--> 111 return f(self, datas, inputs=inputs, masks=masks, tags=tags, **kwargs)
112
113 return wrapper

/usr/local/lib/python3.11/dist-packages/ssm/lds.py in fit(self, datas, inputs, masks, tags, verbose, method, variational_posterior, variational_posterior_kwargs, initialize, discrete_state_init_method, num_init_iters, num_init_restarts, **kwargs)
778 # Initialize the model parameters
779 if initialize:
--> 780 self.initialize(datas, inputs, masks, tags,
781 verbose=verbose,
782 discrete_state_init_method=discrete_state_init_method,

/usr/local/lib/python3.11/dist-packages/ssm/util.py in wrapper(self, datas, inputs, masks, tags, **kwargs)
109 tags = [tags]
110
--> 111 return f(self, datas, inputs=inputs, masks=masks, tags=tags, **kwargs)
112
113 return wrapper

/usr/local/lib/python3.11/dist-packages/ssm/lds.py in initialize(self, datas, inputs, masks, tags, verbose, num_init_iters, discrete_state_init_method, num_init_restarts)
165 num_init_restarts=1):
166 # First initialize the observation model
--> 167 self.emissions.initialize(datas, inputs, masks, tags)
168
169 # Get the initialized variational mean for the data

/usr/local/lib/python3.11/dist-packages/ssm/util.py in wrapper(self, datas, inputs, masks, tags, **kwargs)
109 tags = [tags]
110
--> 111 return f(self, datas, inputs=inputs, masks=masks, tags=tags, **kwargs)
112
113 return wrapper

/usr/local/lib/python3.11/dist-packages/ssm/emissions.py in initialize(self, datas, inputs, masks, tags)
447 def initialize(self, datas, inputs=None, masks=None, tags=None):
448 datas = [interpolate_data(data, mask) for data, mask in zip(datas, masks)]
--> 449 pca = self.initialize_with_pca(datas, inputs=inputs, masks=masks, tags=tags)
    450         self.inv_etas[:,...] = np.log(pca.noise_variance_)
    451

/usr/local/lib/python3.11/dist-packages/ssm/util.py in wrapper(self, datas, inputs, masks, tags, **kwargs)
109 tags = [tags]
110
--> 111 return f(self, datas, inputs=inputs, masks=masks, tags=tags, **kwargs)
112
113 return wrapper

/usr/local/lib/python3.11/dist-packages/ssm/emissions.py in _initialize_with_pca(self, datas, inputs, masks, tags, num_iters)
186
187 # Compute residual after accounting for input
--> 188 resids = [data - np.dot(input, self.Fs[0].T) for data, input in zip(datas, inputs)]
189
190 # Run PCA to get a linear embedding of the data with the maximum effective dimension

/usr/local/lib/python3.11/dist-packages/ssm/emissions.py in <listcomp>(.0)
186
187 # Compute residual after accounting for input
--> 188 resids = [data - np.dot(input, self.Fs[0].T) for data, input in zip(datas, inputs)]
189
190 # Run PCA to get a linear embedding of the data with the maximum effective dimension

/usr/local/lib/python3.11/dist-packages/autograd/tracer.py in f_wrapped(*args, **kwargs)
46 return new_box(ans, trace, node)
47 else:
---> 48 return f_raw(*args, **kwargs)
49 f_wrapped.fun = f_raw
50 f_wrapped._is_autograd_primitive = True

ValueError: shapes (128418,3) and (0,124) not aligned: 3 (dim 1) != 0 (dim 0)
```
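From the traceback, the mismatch is between `external_inputs` (shape `(128418, 3)`) and `self.Fs[0].T` in the emission model (shape `(0, 124)`), which suggests the emissions were built with input dimension 0. A quick shape check (a sketch, using the model constructed in the snippet below) seems to confirm this:

```python
# Sketch of a shape check on the model from the snippet below
print(external_inputs.shape)                      # (128418, 3)
print(rslds_recurrent_inputs.emissions.Fs.shape)  # presumably (1, 124, 0), i.e. input dim 0
```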

This is my code snippet:
```python
import ssm

rslds_recurrent_inputs = ssm.SLDS(
    D_obs, K, D_latent,
    transitions="recurrent",      # uses past neural activity + external inputs
    dynamics="gaussian",          # linear latent dynamics with Gaussian noise
    emissions="gaussian_orthog",  # observation model
    single_subspace=True
)

elbos_recurrent, _ = rslds_recurrent_inputs.fit(
    [firing_rates_normalized],
    inputs=[external_inputs],     # footshock, pupil, speed
    num_iters=10
)
```
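One thing I am unsure about: I never tell the constructor how many input dimensions to expect, so perhaps the input dimension needs to be passed explicitly. Something like the sketch below (assuming `M` is the constructor keyword for the input dimension; I have not confirmed this in the docs):

```python
# Sketch: pass the input dimension explicitly (assuming M is the right keyword)
M_inputs = external_inputs.shape[1]  # 3: footshock, pupil size, speed

rslds_recurrent_inputs = ssm.SLDS(
    D_obs, K, D_latent, M=M_inputs,
    transitions="recurrent",
    dynamics="gaussian",
    emissions="gaussian_orthog",
    single_subspace=True
)
```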

I expect the model to process both the neuronal firing rates and the external inputs, so that the recurrent transitions depend on both past neural activity and the external inputs. Is there a specific way I need to format `inputs` for recurrent transitions? Could this be due to how `inputs` is handled during model initialization? Any suggestions on how to properly include both `firing_rates_normalized` and `external_inputs` in `ssm.SLDS.fit()` would be appreciated.

Thank you!
