qTransformer
qTransformer is a quantum circuit neural network classifier based on chained quantum attention mechanism layers, residual layers, and feed-forward neural network layers. Attention is computed using a trainable, parameterized general two-body interaction between word embedding statevectors in a sentence system; residual connections are represented using CX-gates; feed-forward neural network layers are represented using custom variational ansatzes such as RealAmplitudes and EfficientSU2.
Install / Use
/learn @areeq-hasan/Qtransformer
The circuit representing the model is constructed as follows:
```python
# ------------------------------------------------------------------------------
# Construct Circuit. Excerpted from the model class; QuantumCircuit,
# PauliFeatureMap, and EfficientSU2 come from qiskit, while attention() and
# residual() are defined elsewhere in the package.

# Feature map encodes the word embedding vectors.
feature_map = PauliFeatureMap(n)

# Attention transforms the sentence system to contain information regarding how
# much each word qubit should attend to every other word qubit.
if include_attention:
    attention_layers = [attention(n, l=i) for i in range(reps)]

# Residual adds information regarding the original word embeddings back to the
# post-attention sentence system.
if include_residual:
    residual_layers = [residual(n) for _ in range(reps)]

# Ansatz serves as a feed-forward layer at the end of the transformer block.
feed_forward_layers = [
    EfficientSU2(n, parameter_prefix=str(i)) for i in range(reps)
]

self.circuit = QuantumCircuit(2 * n if include_residual else n)
self.circuit.append(feature_map, range(n))
if include_residual:
    self.circuit.append(feature_map, range(n, 2 * n))
for i in range(reps):
    if include_attention:
        self.circuit.append(attention_layers[i], range(n))
    if include_residual:
        self.circuit.append(residual_layers[i], range(2 * n))
    self.circuit.append(feed_forward_layers[i], range(n))
# ------------------------------------------------------------------------------
```
The wrapper class for the qTransformer lives in /qtransformer/__init__.py.
