Add quantum convolutional neural network (QCNN) #6
Conversation
…ERA for ease of use in QCNN.
Currently, the pooling layer considers two adjacent qubits and applies a dynamic circuit: it measures one of the qubits and applies a controlled phase gate on the other, conditioned on the measurement outcome. However, as given in the reference paper, this measurement can also be performed on both adjacent qubits, with the corresponding controlled phase gates applied to a remaining qubit. This flexibility in the number of qubits to be measured is yet to be implemented; it (perhaps) corresponds to the stride size in a classical pooling layer.
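The measure-then-conditionally-phase idea behind the pooling layer can be sketched classically on a two-qubit statevector. This is an illustrative helper (the name `pool_two_qubits` and its signature are hypothetical, not the PR's API); in the PR itself this is realized as a dynamic circuit with a mid-circuit measurement.

```python
import numpy as np

def pool_two_qubits(state, theta):
    """Classical sketch of the pooling step: measure qubit 0 of a
    2-qubit statevector, and if the outcome is 1, apply a phase
    rotation P(theta) to the surviving qubit 1.

    `state` is a length-4 complex vector in the |q1 q0> basis
    (index = 2*q1 + q0)."""
    state = np.asarray(state, dtype=complex)
    # probability that qubit 0 reads 1 (amplitudes at indices 1 and 3)
    p1 = abs(state[1]) ** 2 + abs(state[3]) ** 2
    outcome = 1 if np.random.random() < p1 else 0
    if outcome == 1:
        # collapse onto q0 = 1, renormalize
        reduced = np.array([state[1], state[3]])
        reduced /= np.linalg.norm(reduced)
        # conditioned phase gate on the surviving qubit
        reduced = np.array([reduced[0], np.exp(1j * theta) * reduced[1]])
    else:
        # collapse onto q0 = 0, no phase applied
        reduced = np.array([state[0], state[2]])
        reduced /= np.linalg.norm(reduced)
    return outcome, reduced
```

The sketch halves the register size per application, mirroring how each pooling layer discards a measured qubit while feeding its outcome forward as a conditional operation.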
```python
return self.circuit.decompose(), list(self.circuit.qubits)
return self.circuit.compose(
    method(self.complex_structure),
```
The conv layer (and the MERA structure) still does not restrict itself to the unmeasured qubits: if applied more than once, a conv layer is built on all the qubits rather than just the unmeasured ones. This needs to be corrected.
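One way to address this is to track the indices of the still-active (unmeasured) qubits explicitly and build each subsequent conv layer only on those. A minimal bookkeeping sketch (function names are hypothetical, not the PR's API):

```python
def pooling_layer(active_qubits):
    """Measure every second active qubit; return (measured, surviving)."""
    measured = active_qubits[0::2]
    surviving = active_qubits[1::2]
    return measured, surviving

def conv_pairs(active_qubits):
    """Pairs of adjacent *active* qubits a conv layer should act on."""
    return list(zip(active_qubits, active_qubits[1:]))

# Alternating conv/pool on 8 qubits: each pool halves the active set,
# so every conv layer only touches qubits that are still unmeasured.
active = list(range(8))
while len(active) > 1:
    pairs = conv_pairs(active)      # conv layer restricted to active qubits
    _, active = pooling_layer(active)
```

With `QuantumCircuit.compose(other, qubits=...)` the conv sub-circuit could then be mapped onto exactly the surviving indices, rather than being rebuilt over the full register.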
The decision to make in-place changes to the circuit while applying the different layers was made because dynamic circuits (Refer: ) raise a CircuitError.
Looking into primitives: primitives currently do not support dynamic circuits. In order to apply these mid-circuit measurements, a simple …
The QCNN structure consists of 4 layers: data embedding, convolutional layer, pooling layer, and fully connected layer, followed by measurement and training.
- [ ] Data encoding integration (separate PR)
- [ ] Training integration (separate PR)