Bump PyO3 and rust-numpy to 0.23.x (#13577)
* Bump PyO3 and rust-numpy to 0.23.x

This commit bumps the versions of PyO3 and rust-numpy used by Qiskit to
the latest release, 0.23. The largest change by volume of code is the
deprecation of all the `*_bound()` methods. These only emit deprecation
warnings, but warnings are fatal in our CI, so the code needs to be
updated. The larger functional change that required updating the code is
the change in the traits for converting Rust values to Python objects.
This actually surfaced a bug in the target, where we were not returning
a proper instruction type for standard gates. The bump also opens up the
opportunity to update our hashbrown version to 0.15.x, but we can't do
that until the next rustworkx-core release.
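
As a rough illustration of the trait change (a sketch only, not code from
this commit; the helper names and the `u32` payload are placeholders):
conversions that went through `ToPyObject`/`IntoPy<PyObject>` in 0.22 now go
through the fallible `IntoPyObject` trait, with `IntoPyObjectExt::into_py_any`
as the convenience used in several places in this diff.

```rust
use pyo3::prelude::*;
use pyo3::IntoPyObjectExt;

// Placeholder helpers; the `u32` payload stands in for any value that
// previously used `ToPyObject`/`IntoPy<PyObject>`.
fn make_bound_value<'py>(py: Python<'py>) -> PyResult<Bound<'py, PyAny>> {
    let value: u32 = 42;
    // pyo3 0.22 (deprecated in 0.23):
    //     let obj: PyObject = value.to_object(py);
    //     let bound = obj.bind(py).clone();
    // pyo3 0.23: the conversion is fallible and yields a `Bound` directly.
    Ok(value.into_pyobject(py)?.into_any())
}

fn make_unbound_value(py: Python<'_>) -> PyResult<Py<PyAny>> {
    // `into_py_any` (from `IntoPyObjectExt`) is the shorthand for getting an
    // owned `Py<PyAny>` in one call.
    42u32.into_py_any(py)
}
```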

* Fix cache pygates build

* Fix cargo test

* Fix impl of IntoPyObject for target's operation

* Fix impl of IntoPyObject for equivalence library circuit

* Remove commented out code

* Fix commutation library conversion

* Fix conversion of qasm3 py register type

* Use into_py_any()

* Use borrowed instead of cloning py bound for NormalOperation

* Fix unitary synthesis failure

In pyo3 0.23, the default conversion for a `SmallVec` of `u8` (and
similar arrays of `u8`) produces a Python `bytes` object. This was
documented as an API change in the release, but it caused the unitary
synthesis test to fail: when evaluating whether the synthesis matched the
natural direction of the 2q gate on the backend, the pass was effectively
evaluating `[0, 1] == b"\x00\x01"`, which is `False` in Python, so it
flipped the direction of the 2q gate. All the 2q gates that were already
correctly directed were therefore getting incorrectly flipped. This
commit fixes the issue by manually creating a `PyList` so the comparisons
work as expected.
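
A hedged sketch of the workaround (the helper name and the fixed-size
`SmallVec<[u8; 2]>` are illustrative, not the pass's exact types): building
the `PyList` explicitly keeps the Python-side comparison against `[0, 1]`
element-wise instead of comparing a `bytes` object to a list.

```rust
use pyo3::prelude::*;
use pyo3::types::PyList;
use smallvec::SmallVec;

// Illustrative only: `qargs` stands in for the qubit indices that get
// compared against a Python list inside the unitary-synthesis pass.
fn qargs_to_pylist<'py>(
    py: Python<'py>,
    qargs: &SmallVec<[u8; 2]>,
) -> PyResult<Bound<'py, PyList>> {
    // Under pyo3 0.23, converting the SmallVec directly would yield a Python
    // `bytes` object, and `b"\x00\x01" == [0, 1]` is False in Python.
    // Building a list of ints keeps the comparison element-wise.
    PyList::new(py, qargs.iter().map(|q| *q as u32))
}
```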

* Fix cargo fmt

* Apply suggestions from code review

Co-authored-by: Kevin Hartman <[email protected]>

* Fix compilation error

* Use into_py_any in missed spot

* Fix lint

* Fix cargo fmt

* Remove overeager trait derives

* Pull in latest pyo3 bugfix release

* Fix error handling

* Fix lifetime for _to_matrix()

* Implement `IntoPyObject` for `&StandardGate`

This makes it more ergonomic to use the blanket implementations of
`IntoPyObject` on ad-hoc structures like `(&T1, &T2)` when one of the
contained types is a `StandardGate`.

* Implement `IntoPyObject` for all `&T` where `T: PyClass + Copy`
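
A sketch of the reference-impl pattern on a made-up `Gate` enum (not
Qiskit's `StandardGate`): the `&Gate` impl just copies the value and defers
to the owned conversion, which is what lets tuples like `(&Gate, &u32)` use
pyo3's blanket tuple implementation of `IntoPyObject`.

```rust
use pyo3::prelude::*;
use pyo3::types::PyInt;

// Made-up Copy enum standing in for `StandardGate`.
#[derive(Clone, Copy)]
enum Gate {
    H = 0,
    X = 1,
}

impl<'py> IntoPyObject<'py> for Gate {
    type Target = PyInt;
    type Output = Bound<'py, Self::Target>;
    type Error = PyErr;

    fn into_pyobject(self, py: Python<'py>) -> Result<Self::Output, Self::Error> {
        // Convert via the discriminant purely for the sake of the example.
        Ok((self as u8).into_pyobject(py)?)
    }
}

// The reference impl copies the value and reuses the owned conversion, so
// ad-hoc structures containing `&Gate` satisfy the blanket impls.
impl<'py> IntoPyObject<'py> for &Gate {
    type Target = PyInt;
    type Output = Bound<'py, Self::Target>;
    type Error = PyErr;

    fn into_pyobject(self, py: Python<'py>) -> Result<Self::Output, Self::Error> {
        (*self).into_pyobject(py)
    }
}
```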

---------

Co-authored-by: Kevin Hartman <[email protected]>
Co-authored-by: Jake Lishman <[email protected]>
3 people authored Jan 21, 2025
1 parent d884a3c commit 9ddb6e2
Showing 63 changed files with 1,220 additions and 1,072 deletions.
36 changes: 21 additions & 15 deletions Cargo.lock

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions Cargo.toml
@@ -20,7 +20,7 @@ hashbrown.version = "0.14.5"
num-bigint = "0.4"
num-complex = "0.4"
ndarray = "0.15"
numpy = "0.22.1"
numpy = "0.23"
smallvec = "1.13"
thiserror = "1.0"
rustworkx-core = "0.15"
@@ -34,7 +34,7 @@ rayon = "1.10"
# distributions). We only activate that feature when building the C extension module; we still need
# it disabled for Rust-only tests to avoid linker errors with it not being loaded. See
# https://pyo3.rs/main/features#extension-module for more.
pyo3 = { version = "0.22.6", features = ["abi3-py39"] }
pyo3 = { version = "0.23", features = ["abi3-py39"] }

# These are our own crates.
qiskit-accelerate = { path = "crates/accelerate" }
@@ -100,7 +100,7 @@ pub(super) fn compose_transforms<'a>(
for (node, params) in nodes_to_replace {
let param_mapping: HashMap<ParameterUuid, Param> = equiv_params
.iter()
.map(|x| ParameterUuid::from_parameter(x.to_object(py).bind(py)))
.map(|x| ParameterUuid::from_parameter(&x.into_pyobject(py).unwrap()))
.zip(params)
.map(|(uuid, param)| -> PyResult<(ParameterUuid, Param)> {
Ok((uuid?, param.clone_ref(py)))
18 changes: 9 additions & 9 deletions crates/accelerate/src/basis/basis_translator/mod.rs
@@ -223,7 +223,7 @@ fn extract_basis(
unreachable!("Control flow operation is not an instance of PyInstruction.")
};
let inst_bound = inst.instruction.bind(py);
for block in inst_bound.getattr("blocks")?.iter()? {
for block in inst_bound.getattr("blocks")?.try_iter()? {
recurse_circuit(py, block?, basis, min_qubits)?;
}
}
@@ -254,7 +254,7 @@ fn extract_basis(
if inst.op.control_flow() {
let operation_ob = instruction_object.getattr(intern!(py, "operation"))?;
let blocks = operation_ob.getattr("blocks")?;
for block in blocks.iter()? {
for block in blocks.try_iter()? {
recurse_circuit(py, block?, basis, min_qubits)?;
}
}
@@ -324,7 +324,7 @@ fn extract_basis_target(
let bound_inst = op.instruction.bind(py);
// Use python side extraction instead of the Rust method `op.blocks` due to
// required usage of a python-space method `QuantumCircuit.has_calibration_for`.
let blocks = bound_inst.getattr("blocks")?.iter()?;
let blocks = bound_inst.getattr("blocks")?.try_iter()?;
for block in blocks {
extract_basis_target_circ(
&block?,
@@ -401,7 +401,7 @@ fn extract_basis_target_circ(
unreachable!("Control flow op is not a control flow op. But control_flow is `true`")
};
let bound_inst = op.instruction.bind(py);
let blocks = bound_inst.getattr("blocks")?.iter()?;
let blocks = bound_inst.getattr("blocks")?.try_iter()?;
for block in blocks {
extract_basis_target_circ(
&block?,
@@ -441,7 +441,7 @@ fn apply_translation(
let mut flow_blocks = vec![];
let bound_obj = control_op.instruction.bind(py);
let blocks = bound_obj.getattr("blocks")?;
for block in blocks.iter()? {
for block in blocks.try_iter()? {
let block = block?;
let dag_block: DAGCircuit =
circuit_to_dag(py, block.extract()?, true, None, None)?;
@@ -665,7 +665,7 @@ fn replace_node(
let parameter_map = target_params
.iter()
.zip(node.params_view())
.into_py_dict_bound(py);
.into_py_dict(py)?;
for inner_index in target_dag.topological_op_nodes()? {
let inner_node = &target_dag[inner_index].unwrap_operation();
let old_qargs = dag.get_qargs(node.qubits);
@@ -700,7 +700,7 @@ fn replace_node(
if let Param::ParameterExpression(param_obj) = param {
let bound_param = param_obj.bind(py);
let exp_params = param.iter_parameters(py)?;
let bind_dict = PyDict::new_bound(py);
let bind_dict = PyDict::new(py);
for key in exp_params {
let key = key?;
bind_dict.set_item(&key, parameter_map.get_item(&key)?)?;
@@ -767,7 +767,7 @@ fn replace_node(

if let Param::ParameterExpression(old_phase) = target_dag.global_phase() {
let bound_old_phase = old_phase.bind(py);
let bind_dict = PyDict::new_bound(py);
let bind_dict = PyDict::new(py);
for key in target_dag.global_phase().iter_parameters(py)? {
let key = key?;
bind_dict.set_item(&key, parameter_map.get_item(&key)?)?;
@@ -788,7 +788,7 @@
}
if !new_phase.getattr(intern!(py, "parameters"))?.is_truthy()? {
new_phase = new_phase.call_method0(intern!(py, "numeric"))?;
if new_phase.is_instance(&PyComplex::type_object_bound(py))? {
if new_phase.is_instance(&PyComplex::type_object(py))? {
return Err(TranspilerError::new_err(format!(
"Global phase must be real, but got {}",
new_phase.repr()?
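The renamings in this file follow the general 0.23 pattern: `iter()` on a Python object becomes `try_iter()`, and the `_bound` suffixes on constructors such as `PyDict::new_bound` and `into_py_dict_bound` are dropped. A minimal sketch of the new spellings, using a hypothetical `count_blocks` helper (not code from this diff):

```rust
use pyo3::prelude::*;
use pyo3::types::PyDict;

// Illustrative helper: iterate the `blocks` attribute of a Python object and
// count its elements, using the pyo3 0.23 spellings.
fn count_blocks(py: Python<'_>, obj: &Bound<'_, PyAny>) -> PyResult<usize> {
    let mut count = 0;
    // 0.22: obj.getattr("blocks")?.iter()?
    // 0.23: `try_iter` makes the fallibility of the iteration protocol explicit.
    for block in obj.getattr("blocks")?.try_iter()? {
        let _block = block?;
        count += 1;
    }
    // Container constructors drop the `_bound` suffix:
    // `PyDict::new_bound(py)` becomes `PyDict::new(py)`.
    let _scratch = PyDict::new(py);
    Ok(count)
}
```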
2 changes: 1 addition & 1 deletion crates/accelerate/src/check_map.rs
@@ -42,7 +42,7 @@ fn recurse<'py>(
if let OperationRef::Instruction(py_inst) = inst.op.view() {
let raw_blocks = py_inst.instruction.getattr(py, "blocks")?;
let circuit_to_dag = CIRCUIT_TO_DAG.get_bound(py);
for raw_block in raw_blocks.bind(py).iter().unwrap() {
for raw_block in raw_blocks.bind(py).try_iter()? {
let block_obj = raw_block?;
let block = block_obj
.getattr(intern!(py, "_data"))?
8 changes: 3 additions & 5 deletions crates/accelerate/src/circuit_library/blocks.rs
@@ -42,16 +42,15 @@ impl BlockOperation {
)),
Self::PyCustom { builder } => {
// the builder returns a Python operation plus the bound parameters
let py_params =
PyList::new_bound(py, params.iter().map(|&p| p.clone().into_py(py))).into_any();
let py_params = PyList::new(py, params.iter().map(|&p| p.clone()))?.into_any();

let job = builder.call1(py, (py_params,))?;
let result = job.downcast_bound::<PyTuple>(py)?;

let operation: OperationFromPython = result.get_item(0)?.extract()?;
let bound_params = result
.get_item(1)?
.iter()?
.try_iter()?
.map(|ob| Param::extract_no_coerce(&ob?))
.collect::<PyResult<SmallVec<[Param; 3]>>>()?;

@@ -84,7 +83,6 @@ impl Block {
#[staticmethod]
#[pyo3(signature = (num_qubits, num_parameters, builder,))]
pub fn from_callable(
py: Python,
num_qubits: i64,
num_parameters: i64,
builder: &Bound<PyAny>,
@@ -96,7 +94,7 @@
}
let block = Block {
operation: BlockOperation::PyCustom {
builder: builder.to_object(py),
builder: builder.clone().unbind(),
},
num_qubits: num_qubits as u32,
num_parameters: num_parameters as usize,
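Two changes in this file recur throughout the diff: `PyList::new` is now fallible because element conversion can fail, and storing a borrowed `&Bound<PyAny>` as an owned `Py<PyAny>` is spelled `clone().unbind()` instead of `to_object(py)`. A small sketch under those assumptions (the `build_and_store` helper and its types are illustrative, not Qiskit code):

```rust
use pyo3::prelude::*;
use pyo3::types::PyList;

// Illustrative: build a Python list from Rust data and stash a callable for
// later use, mirroring the two 0.23 patterns used in `blocks.rs`.
fn build_and_store<'py>(
    params: &[f64],
    builder: &Bound<'py, PyAny>,
) -> PyResult<(Bound<'py, PyList>, Py<PyAny>)> {
    let py = builder.py();
    // `PyList::new` now returns `PyResult`, so propagate the error with `?`.
    let py_params = PyList::new(py, params.iter().copied())?;
    // `builder.to_object(py)` becomes an explicit clone of the smart pointer
    // followed by `unbind()` to detach it from the GIL-bound lifetime.
    let stored: Py<PyAny> = builder.clone().unbind();
    Ok((py_params, stored))
}
```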
2 changes: 1 addition & 1 deletion crates/accelerate/src/circuit_library/entanglement.rs
@@ -247,7 +247,7 @@ pub fn get_entangler_map<'py>(
Ok(entanglement) => entanglement
.into_iter()
.map(|vec| match vec {
Ok(vec) => Ok(PyTuple::new_bound(py, vec)),
Ok(vec) => PyTuple::new(py, vec),
Err(e) => Err(e),
})
.collect::<Result<Vec<_>, _>>(),
4 changes: 2 additions & 2 deletions crates/accelerate/src/circuit_library/parameter_ledger.rs
@@ -29,7 +29,7 @@ pub(super) type LayerParameters<'a> = Vec<BlockParameters<'a>>; // parameter in
/// Internally, the parameters are stored in a 1-D vector and the ledger keeps track of
/// which indices belong to which layer. For example, a 2-qubit circuit where both the
/// rotation and entanglement layer have 1 block with 2 parameters each, we would store
///
///
/// [x0 x1 x2 x3 x4 x5 x6 x7 ....]
/// ----- ----- ----- -----
/// rep0 rep0 rep1 rep2
@@ -105,7 +105,7 @@ impl ParameterLedger {
let parameter_vector: Vec<Param> = imports::PARAMETER_VECTOR
.get_bound(py)
.call1((parameter_prefix, num_parameters))? // get the Python ParameterVector
.iter()? // iterate over the elements and cast them to Rust Params
.try_iter()? // iterate over the elements and cast them to Rust Params
.map(|ob| Param::extract_no_coerce(&ob?))
.collect::<PyResult<_>>()?;

4 changes: 2 additions & 2 deletions crates/accelerate/src/circuit_library/pauli_feature_map.rs
@@ -67,12 +67,12 @@ pub fn pauli_feature_map(
let pauli_strings = _get_paulis(feature_dimension, paulis)?;

// set the default value for entanglement
let default = PyString::new_bound(py, "full");
let default = PyString::new(py, "full");
let entanglement = entanglement.unwrap_or(&default);

// extract the parameters from the input variable ``parameters``
let parameter_vector = parameters
.iter()?
.try_iter()?
.map(|el| Param::extract_no_coerce(&el?))
.collect::<PyResult<Vec<Param>>>()?;

6 changes: 3 additions & 3 deletions crates/accelerate/src/circuit_library/quantum_volume.rs
@@ -113,7 +113,7 @@ pub fn quantum_volume(
let num_unitaries = width * depth;
let mut permutation: Vec<Qubit> = (0..num_qubits).map(Qubit).collect();

let kwargs = PyDict::new_bound(py);
let kwargs = PyDict::new(py);
kwargs.set_item(intern!(py, "num_qubits"), 2)?;
let mut build_instruction = |(unitary_index, unitary_array): (usize, Array2<Complex64>),
rng: &mut Pcg64Mcg|
@@ -122,7 +122,7 @@
if layer_index == 0 {
permutation.shuffle(rng);
}
let unitary = unitary_array.into_pyarray_bound(py);
let unitary = unitary_array.into_pyarray(py);

let unitary_gate = UNITARY_GATE
.get_bound(py)
@@ -137,7 +137,7 @@
let qubit = layer_index * 2;
Ok((
PackedOperation::from_gate(Box::new(instruction)),
smallvec![Param::Obj(unitary.unbind().into())],
smallvec![Param::Obj(unitary.into_any().unbind())],
vec![permutation[qubit], permutation[qubit + 1]],
vec![],
))
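rust-numpy 0.23 follows the same naming convention: `into_pyarray_bound` becomes `into_pyarray`, returning a `Bound` array. A minimal sketch, assuming a plain `f64` matrix rather than the `Complex64` unitaries used above, and using rust-numpy's `ndarray` re-export so the crate versions always agree:

```rust
use numpy::ndarray::Array2;
use numpy::{IntoPyArray, PyArray2};
use pyo3::prelude::*;

// Illustrative: move an owned ndarray matrix into a NumPy array and keep an
// owned handle, mirroring what `quantum_volume.rs` does with its unitaries.
fn matrix_to_numpy(py: Python<'_>, matrix: Array2<f64>) -> Py<PyAny> {
    // 0.22: matrix.into_pyarray_bound(py)
    // 0.23: the `Bound`-returning constructor is the default name.
    let array: Bound<'_, PyArray2<f64>> = matrix.into_pyarray(py);
    array.into_any().unbind()
}
```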
17 changes: 9 additions & 8 deletions crates/accelerate/src/commutation_analysis.rs
@@ -146,39 +146,40 @@ pub(crate) fn analyze_commutations(
// The Python dict will store both of these dictionaries in one.
let (commutation_set, node_indices) = analyze_commutations_inner(py, dag, commutation_checker)?;

let out_dict = PyDict::new_bound(py);
let out_dict = PyDict::new(py);

// First set the {wire: [commuting_nodes_1, ...]} bit
for (wire, commutations) in commutation_set {
// we know all wires are of type Wire::Qubit, since in analyze_commutations_inner
// we only iterater over the qubits
let py_wire = match wire {
Wire::Qubit(q) => dag.qubits().get(q).unwrap().to_object(py),
Wire::Qubit(q) => dag.qubits().get(q).unwrap().into_pyobject(py),
_ => return Err(PyValueError::new_err("Unexpected wire type.")),
};
}?;

out_dict.set_item(
py_wire,
PyList::new_bound(
PyList::new(
py,
commutations.iter().map(|inner| {
PyList::new_bound(
PyList::new(
py,
inner
.iter()
.map(|node_index| dag.get_node(py, *node_index).unwrap()),
)
.unwrap()
}),
),
)?,
)?;
}

// Then we add the {(node, wire): index} dictionary
for ((node_index, wire), index) in node_indices {
let py_wire = match wire {
Wire::Qubit(q) => dag.qubits().get(q).unwrap().to_object(py),
Wire::Qubit(q) => dag.qubits().get(q).unwrap().into_pyobject(py),
_ => return Err(PyValueError::new_err("Unexpected wire type.")),
};
}?;
out_dict.set_item((dag.get_node(py, node_index)?, py_wire), index)?;
}

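Because `PyList::new` is fallible in 0.23, nesting it inside an iterator closure needs either an `unwrap()` (as in the hunk above) or collecting into a `PyResult` first. A sketch of the collecting variant, with made-up grouping data:

```rust
use pyo3::prelude::*;
use pyo3::types::PyList;

// Illustrative: build a list of lists (e.g. groups of commuting node ids)
// without unwrapping inside the closure.
fn nested_list<'py>(py: Python<'py>, groups: &[Vec<u32>]) -> PyResult<Bound<'py, PyList>> {
    // Collect the inner lists first so any conversion error propagates with `?`.
    let inner: Vec<Bound<'py, PyList>> = groups
        .iter()
        .map(|group| PyList::new(py, group.iter().copied()))
        .collect::<PyResult<_>>()?;
    PyList::new(py, inner)
}
```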
(Remaining changed files not shown.)

1 comment on commit 9ddb6e2

@Magikaaarp

Hello,
I recently encountered a build issue with the latest version of the project, specifically when running `cargo build`. The build fails at this commit but succeeds at the parent commit (d884a3c).
Upon investigation, I found that the issue is related to a dependency conflict: numpy 0.23 requires ndarray 0.16.1, which is incompatible with other dependencies in the project (ndarray 0.15.6). Despite my efforts, I have been unable to resolve this conflict.
I want to mention that I am not entirely certain my understanding of the issue is correct, as I am still learning Rust. However, given that the previous version did not have this problem, I thought it was worth bringing to your attention.
I would greatly appreciate any insights or assistance you could provide to help resolve this issue.
