- Poisson generalized linear model (GLM) with regularization.
- Unlike the Glmnet toolbox (https://glmnet.stanford.edu/articles/glmnet.html) or the built-in LASSOGLM function, this toolbox assigns a separate lambda value to each variable type and searches the n_variable-dimensional grid of lambda combinations.
- Higher-order Tikhonov regularization is implemented, which can be useful for smooth parameters.
n_sample = 3600; n_var = 30;
c = -1;                                        % baseline (intercept)
beta1 = 0.2 * sin(linspace(0, pi, n_var))';    % smooth true weights, variable group 1
beta2 = 0.2 * cos(linspace(0, 4*pi, n_var))';  % smooth true weights, variable group 2
X_1 = randn(n_sample, n_var);                  % design matrix, group 1
X_2 = randn(n_sample, n_var);                  % design matrix, group 2
mu = [X_1, X_2] * [beta1; beta2] + c;          % linear predictor
y = poissrnd(exp(mu), n_sample, 1);            % Poisson observations
% Zeroth-order Tikhonov regularization
out0 = glm.cvglm({X_1, X_2}, y);
% First-order Tikhonov regularization
out1 = glm.cvglm({X_1, X_2}, y, 'order', 1);
% Second-order Tikhonov regularization
out2 = glm.cvglm({X_1, X_2}, y, 'order', 2);
% Compare the true weights (black) with the cross-validated estimates
% for each regularization order.
fig = figure(1); clf;
subplot(121); hold on;
plot(beta1, 'k');        % true weights, group 1
plot(out0.w{1}, ':r');   % zeroth-order estimate
plot(out1.w{1}, ':g');   % first-order estimate
plot(out2.w{1}, ':b');   % second-order estimate
subplot(122); hold on;
plot(beta2, 'k');        % true weights, group 2
plot(out0.w{2}, ':r');   % zeroth-order estimate
plot(out1.w{2}, ':g');   % first-order estimate
plot(out2.w{2}, ':b');   % second-order estimate
Firing rate at time $t$:

$$
r(t) = e^{ (\mathbf{k} \ast \mathbf{x})(t) + c }
$$

Or we can directly model the spike count per time bin:

$$
n(t) = r(t) \, \Delta t
$$

where $\Delta t$ is the time bin size.

If the kernel is a weighted sum of basis kernels, $\mathbf{k} = w_1 \mathbf{k_1} + w_2 \mathbf{k_2} + w_3 \mathbf{k_3}$, distributivity and associativity can be applied to the convolution:

$$
\mathbf{k} \ast \mathbf{x} = w_1 (\mathbf{k_1} \ast \mathbf{x}) + w_2 (\mathbf{k_2} \ast \mathbf{x}) + w_3 (\mathbf{k_3} \ast \mathbf{x})
$$

Therefore,

$$
r(t) = e^{ (\mathbf{k_1} \ast \mathbf{x})(t) \, w_1 + (\mathbf{k_2} \ast \mathbf{x})(t) \, w_2 + (\mathbf{k_3} \ast \mathbf{x})(t) \, w_3 + c }
$$

where $\mathbf{x}$ is the stimulus, the $\mathbf{k_i}$ are basis kernels, and the weights $w_i$ are the parameters to estimate; each convolved regressor $(\mathbf{k_i} \ast \mathbf{x})(t)$ becomes one column of the design matrix $\mathbf{X}$.
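This linearity is what lets the basis-kernel weights enter the model as ordinary GLM coefficients. As a numerical sanity check, a minimal sketch (the stimulus `x`, kernels `k1`-`k3`, and weights `w` here are made-up illustrations, not part of the toolbox):

```matlab
% Convolving with a weighted sum of kernels equals the weighted sum
% of individual convolutions (distributivity of convolution).
x  = randn(1000, 1);                      % example stimulus
t  = (0:19)';
k1 = exp(-t/5);                           % example basis kernels
k2 = t .* exp(-t/5);
k3 = t.^2 .* exp(-t/5);
w  = [0.5; -0.3; 0.1];                    % kernel weights

lhs = conv(x, w(1)*k1 + w(2)*k2 + w(3)*k3);
rhs = w(1)*conv(x, k1) + w(2)*conv(x, k2) + w(3)*conv(x, k3);
max(abs(lhs - rhs))                       % ~1e-16: identical up to round-off
```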
The probability of observing $y(t)$ spikes in a time bin with expected spike count $n(t)$ follows the Poisson distribution:

$$
p\left( y(t) \mid n(t) \right) = \frac{ n(t)^{y(t)} \, e^{-n(t)} }{ y(t)! }
$$

Then we sum the log-probability over every time bin:

$$
\log p = \sum_{t} \left( y(t) \log n(t) - n(t) - \log y(t)! \right)
$$

where $n(t) = \Delta t \, e^{ \mathbf{x}(t)^\top \mathbf{w} + c }$ and $\mathbf{x}(t)^\top$ is the $t$-th row of the design matrix $\mathbf{X}$.
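This negative log-likelihood is straightforward to evaluate. A minimal sketch under the definitions above (`Xd`, `dt`, and the anonymous function `nll` are illustrative names; `gammaln(y + 1)` computes $\log y!$ stably):

```matlab
% Negative Poisson log-likelihood for weights w and baseline c.
% Xd: T-by-P design matrix, y: T-by-1 spike counts, dt: bin size.
nll = @(w, c, Xd, y, dt) sum( dt * exp(Xd*w + c) ...
      - y .* (Xd*w + c + log(dt)) + gammaln(y + 1) );
```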
Gradient of the negative log-likelihood with respect to $\mathbf{w}$:

$$
\nabla_{\mathbf{w}} \left( -\log p \right) = \mathbf{X}^\top \left( \mathbf{n} - \mathbf{y} \right)
$$

Hessian with respect to $\mathbf{w}$:

$$
\nabla^{2}_{\mathbf{w}} \left( -\log p \right) = \mathbf{X}^\top \, \mathrm{diag}(\mathbf{n}) \, \mathbf{X}
$$
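With the gradient and Hessian in closed form, the weights can be fit by Newton-Raphson. This is a minimal sketch, not the toolbox's internal solver; it absorbs the baseline $c$ (and the $\log \Delta t$ term) into an intercept column of the design matrix:

```matlab
% Newton-Raphson for the unpenalized Poisson GLM.
Xd = [ones(n_sample, 1), X_1, X_2];   % intercept column plus regressors
w  = zeros(size(Xd, 2), 1);           % initial weights
for iter = 1:25
    n    = exp(Xd * w);               % expected counts per bin
    grad = Xd' * (n - y);             % gradient of -log p
    H    = Xd' * (n .* Xd);           % Hessian: X' * diag(n) * X
    dw   = H \ grad;                  % Newton step
    w    = w - dw;
    if norm(dw) < 1e-8, break; end
end
```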
We want to minimize the penalized negative log-likelihood:

$$
\min_{\mathbf{w}} \left( -\log p + \sum_{i} \frac{1}{2} \lambda_{i} \left\| \mathbf{L}_{i} \mathbf{w}_{i} \right\|_{2}^{2} \right)
$$

where the matrix $\mathbf{L}_{i}$ is the regularization operator for the $i$-th variable group and $\lambda_{i}$ controls its regularization strength.
A common choice for the regularization operator is the identity matrix (zeroth-order Tikhonov regularization, i.e., ridge regression). For smooth weights, scaled finite-difference matrices can be used, such as

$$
\mathbf{L}_{1} = \begin{bmatrix} -1 & 1 & & \\ & \ddots & \ddots & \\ & & -1 & 1 \end{bmatrix}, \qquad
\mathbf{L}_{2} = \begin{bmatrix} 1 & -2 & 1 & & \\ & \ddots & \ddots & \ddots & \\ & & 1 & -2 & 1 \end{bmatrix}
$$

for first- and second-order regularization, respectively.
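Such operators are easy to build in MATLAB with `diff`; a minimal sketch (again illustrative, not the toolbox's internal code):

```matlab
% Finite-difference regularization operators for n_var weights.
L0 = eye(n_var);             % zeroth order: ridge regression
L1 = diff(eye(n_var), 1);    % (n_var-1)-by-n_var first-difference matrix
L2 = diff(eye(n_var), 2);    % (n_var-2)-by-n_var second-difference matrix

% The penalty 0.5 * lambda * norm(L*w)^2 adds lambda * (L'*L) * w to the
% gradient and lambda * (L'*L) to the Hessian of the Newton update.
```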