
Commit 3ab870d: Refactor hardware integration to top-level provider layout
Parent: d791710
28 files changed: +313 −218 lines

README.md

Lines changed: 9 additions & 2 deletions

@@ -237,13 +237,19 @@ recommended data transport, and adapter API.
 Xanadu-style job conversion example:
 
 ```bash
-bash examples/hardware_integration/run.sh
+bash hardware_integration/xanadu/run.sh
 ```
 
 Aurora/QCA/GKP fixture conversions:
 
 ```bash
-bash examples/hardware_integration/run_public_datasets.sh
+bash hardware_integration/xanadu/run_public_datasets.sh
+```
+
+One-command real-data slice (download + convert + replay):
+
+```bash
+bash hardware_integration/xanadu/xandau_hardware_data.sh --install-deps
 ```
 
 For large real datasets, use converter streaming/chunk flags:

@@ -264,6 +270,7 @@ Replay converted NDJSON through the C++ adapter:
 include/ # public headers and interfaces
 src/ # simulator and decoder implementations
 examples/ # reproducible runs and plotting scripts
+hardware_integration/ # provider-oriented hardware data ingestion
 ```
 
 ## Release Notes

docs/hardware-integration.md

Lines changed: 29 additions & 14 deletions

@@ -6,6 +6,9 @@ LiDMaS+ supports three data ingestion modes for hardware integration:
 2. File batch (NDJSON)
 3. In-process C++ adapter API
 
+Provider-specific examples live under `hardware_integration/<provider>/`.
+Current provider implementation: `hardware_integration/xanadu/`.
+
 ## Recommended default
 
 - Use **sparse time-stamped syndrome events** by default.

@@ -47,7 +50,7 @@ Supported source modes:
 Legacy job JSON:
 
 ```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format xanadu_job_json \
 --input /path/to/xanadu_job.json \
 --mapping /path/to/your_mapping.json \

@@ -57,25 +60,37 @@ python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
 Quick demo:
 
 ```bash
-bash examples/hardware_integration/run.sh
+bash hardware_integration/xanadu/run.sh
 ```
 
 Aurora / QCA / GKP fixture demos:
 
 ```bash
-bash examples/hardware_integration/run_public_datasets.sh
+bash hardware_integration/xanadu/run_public_datasets.sh
+```
+
+One-command real-data slices (download + convert + replay):
+
+```bash
+bash hardware_integration/xanadu/xandau_hardware_data.sh --install-deps
+
+# QCA fig3b
+bash hardware_integration/xanadu/xandau_hardware_data.sh \
+--dataset qca_fig3b \
+--max-shots 200000 \
+--install-deps
 ```
 
 Real Aurora decoder-demo batch:
 
 ```bash
 python3 -m pip install numpy
 
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format aurora_switch_dir \
 --stream \
 --input /path/to/decoder_demo/signal/batch_0 \
---mapping examples/hardware_integration/xanadu_aurora_mapping_example.json \
+--mapping hardware_integration/xanadu/xanadu_aurora_mapping_example.json \
 --out examples/results/hardware_integration/decoder_requests_aurora.ndjson \
 --aurora-binarize \
 --max-shots 20000 \

@@ -85,11 +100,11 @@ python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
 Real QCA sample matrix:
 
 ```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format shot_matrix \
 --stream \
 --input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
+--mapping hardware_integration/xanadu/xanadu_qca_mapping_example.json \
 --out examples/results/hardware_integration/decoder_requests_qca.ndjson \
 --max-shots 50000 \
 --progress-every 10000

@@ -98,20 +113,20 @@ python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
 Chunk large QCA files by repeating conversion with shifted `--shot-start` and `--append-out`:
 
 ```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format shot_matrix \
 --stream \
 --input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
+--mapping hardware_integration/xanadu/xanadu_qca_mapping_example.json \
 --out examples/results/hardware_integration/decoder_requests_qca.ndjson \
 --shot-start 0 \
 --max-shots 200000
 
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format shot_matrix \
 --stream \
 --input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
+--mapping hardware_integration/xanadu/xanadu_qca_mapping_example.json \
 --out examples/results/hardware_integration/decoder_requests_qca.ndjson \
 --append-out \
 --shot-start 200000 \
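The chunked pattern above can be scripted rather than typed once per chunk. The following is a hypothetical dry-run sketch, not a script shipped in this commit: it only builds the command strings implied by the `--shot-start`/`--append-out` flags, so the offsets and the append rule can be checked before anything runs.

```python
# Hypothetical dry-run helper (not part of the repo): build the chunked
# converter invocations implied by --shot-start / --append-out above.
CHUNK = 200_000
commands = []
for i in range(2):
    cmd = [
        "python3", "hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py",
        "--source-format", "shot_matrix", "--stream",
        "--shot-start", str(i * CHUNK), "--max-shots", str(CHUNK),
    ]
    if i > 0:
        # every chunk after the first appends instead of overwriting
        cmd.append("--append-out")
    commands.append(" ".join(cmd))

for c in commands:
    print(c)
```

Only the second command carries `--append-out`, and its `--shot-start` lands at 200000, matching the two-chunk example above.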
@@ -121,15 +136,15 @@ python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
 Count-compressed GKP outcomes:
 
 ```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
+python3 hardware_integration/xanadu/convert_xanadu_job_to_decoder_io.py \
 --source-format count_table_json \
 --input /path/to/gkp_outcome_counts.json \
---mapping examples/hardware_integration/xanadu_gkp_mapping_example.json \
+--mapping hardware_integration/xanadu/xanadu_gkp_mapping_example.json \
 --out examples/results/hardware_integration/decoder_requests_gkp.ndjson
 ```
 
 The mapping file controls how measured modes are converted to syndrome events.
-See `examples/hardware_integration/xanadu_syndrome_mapping_example.json`.
+See `hardware_integration/xanadu/xanadu_syndrome_mapping_example.json`.
 
 Large-data controls:
 
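The mapping-driven conversion mentioned here reduces to the parity rule documented in the old README: a stabilizer fires when `(sum(modes) % mod) == trigger_on`. A minimal illustrative sketch of that rule, with a made-up two-stabilizer mapping — this is not the repo's converter:

```python
# Illustrative sketch of the mapping parity rule (not the repo converter):
# a stabilizer fires when (sum of its measured modes) % mod == trigger_on.
def syndrome_events(shot, stabilizers):
    """Return indices of stabilizers that trigger for one measured shot."""
    events = []
    for stab in stabilizers:
        parity = sum(shot[m] for m in stab["modes"]) % stab.get("mod", 2)
        if parity == stab.get("trigger_on", 1):
            events.append(stab["index"])
    return events

# Toy mapping: two parity checks over a 4-mode shot (hypothetical values).
stabs = [
    {"index": 0, "type": "Z", "modes": [0, 1]},
    {"index": 1, "type": "X", "modes": [2, 3], "mod": 2, "trigger_on": 1},
]
print(syndrome_events([1, 0, 1, 1], stabs))  # -> [0]: modes 0+1 have odd parity
```

The `mod` and `trigger_on` defaults (2 and 1) follow the mapping-notes description; real mappings are hardware- and experiment-specific.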

examples/README.md

Lines changed: 1 addition & 1 deletion

@@ -27,7 +27,7 @@ LIDMAS_SKIP_PY_DEPS=1 ./examples/hybrid_threshold/run.sh
 - `decoder_comparison/`: same sweep across multiple decoders.
 - `failure_debug/`: stress run and failure-dump capture workflow.
 - `plot_only/`: publication-grade plotting from existing CSV files.
-- `hardware_integration/`: convert Xanadu datasets (Aurora/QCA/GKP/job JSON) to LiDMaS+ decoder IO NDJSON.
+- `../hardware_integration/`: provider-oriented hardware integration entry point; current provider: `hardware_integration/xanadu/` (Aurora/QCA/GKP/job JSON) plus one-command real-data downloader.
 
 
 ## Central Results Folder
Lines changed: 9 additions & 201 deletions

@@ -1,209 +1,17 @@
-# hardware_integration
+# Hardware Integration (Moved)
 
-This example converts Xanadu datasets to LiDMaS+ `decoder_io` NDJSON.
-The same converter now supports:
+Hardware integration entry points now live at:
 
-- legacy Xanadu job JSON payloads,
-- Aurora decoder-demo switch-setting directories,
-- QCA/Borealis-style shot matrices (`samples.npy`),
-- count-compressed outcomes (useful for GKP-style exports).
+- `hardware_integration/xanadu/`
 
-## Files
-
-- `convert_xanadu_job_to_decoder_io.py`: converter script.
-- `xanadu_job_result_example.json`: minimal sample job payload.
-- `xanadu_syndrome_mapping_example.json`: mode-to-syndrome parity mapping.
-- `aurora_batch_example/`: tiny Aurora-style batch with `switch_settings_qpu_*.json`.
-- `xanadu_aurora_mapping_example.json`: Aurora mapping.
-- `xanadu_qca_samples_example.json`: QCA-like shot-matrix fixture.
-- `xanadu_qca_mapping_example.json`: QCA mapping.
-- `xanadu_gkp_counts_example.json`: count-compressed outcome fixture.
-- `xanadu_gkp_mapping_example.json`: GKP mapping.
-- `run.sh`: one-command local demo.
-- `run_aurora.sh`: Aurora conversion demo.
-- `run_qca.sh`: QCA conversion demo.
-- `run_gkp.sh`: GKP count-table conversion demo.
-- `run_public_datasets.sh`: run Aurora + QCA + GKP demos.
-- `replay.sh`: decode generated requests via `./build/lidmas --decoder_io_replay`.
-
-## Quick Run
-
-```bash
-bash examples/hardware_integration/run.sh
-```
-
-Output:
-
-- `examples/results/hardware_integration/decoder_requests.ndjson`
-
-Each NDJSON line is a `DecodeRequest` compatible with `schemas/decoder_io.proto`.
-
-Run all public-dataset fixtures:
-
-```bash
-bash examples/hardware_integration/run_public_datasets.sh
-```
-
-Outputs:
-
-- `examples/results/hardware_integration/decoder_requests_aurora.ndjson`
-- `examples/results/hardware_integration/decoder_requests_qca.ndjson`
-- `examples/results/hardware_integration/decoder_requests_gkp.ndjson`
-
-Replay those requests through the C++ adapter:
-
-```bash
-./build/lidmas --decoder_io_replay \
---decoder_io_in=examples/results/hardware_integration/decoder_requests.ndjson \
---decoder_io_out=examples/results/hardware_integration/decoder_responses.ndjson \
---decoder_io_config=schemas/surface_decoder_adapter_config.json \
---decoder_io_continue_on_error
-```
-
-Helper script (auto-derives response filename):
-
-```bash
-bash examples/hardware_integration/replay.sh \
-examples/results/hardware_integration/decoder_requests_aurora.ndjson
-```
-
-## Real Xanadu Job Data
-
-### Aurora decoder_demo batch (switch settings)
-
-Requires NumPy when reading `.npy` files:
-
-```bash
-python3 -m pip install numpy
-```
-
-Convert one Aurora batch directory:
-
-```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format aurora_switch_dir \
---input /path/to/decoder_demo/signal/batch_0 \
---mapping examples/hardware_integration/xanadu_aurora_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_aurora.ndjson \
---aurora-binarize \
---max-shots 20000 \
---meta hardware=xanadu \
---meta dataset=aurora_decoder_demo
-```
-
-For large Aurora batches, enable streaming and progress logs:
-
-```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format aurora_switch_dir \
---stream \
---input /path/to/decoder_demo/signal/batch_0 \
---mapping examples/hardware_integration/xanadu_aurora_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_aurora.ndjson \
---aurora-binarize \
---max-shots 200000 \
---progress-every 50000
-```
-
-### QCA / Borealis samples.npy
-
-Convert `samples.npy` (shape typically `n x 1 x M`):
-
-```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format shot_matrix \
---stream \
---input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_qca.ndjson \
---max-shots 50000 \
---meta hardware=xanadu \
---meta dataset=qca
-```
-
-Chunked QCA conversion (memory-safe over very large files):
-
-```bash
-# chunk 1
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format shot_matrix \
---stream \
---input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_qca.ndjson \
---shot-start 0 \
---max-shots 200000 \
---progress-every 50000
-
-# chunk 2 (append)
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format shot_matrix \
---stream \
---input /path/to/fig3a/samples.npy \
---mapping examples/hardware_integration/xanadu_qca_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_qca.ndjson \
---append-out \
---shot-start 200000 \
---max-shots 200000 \
---progress-every 50000
-```
-
-### GKP outcome counts
-
-For count-compressed outcomes exported from your analysis notebook:
+Run from repo root:
 
 ```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---source-format count_table_json \
---input /path/to/gkp_outcome_counts.json \
---mapping examples/hardware_integration/xanadu_gkp_mapping_example.json \
---out examples/results/hardware_integration/decoder_requests_gkp.ndjson \
---max-shots 100000 \
---meta hardware=xanadu \
---meta dataset=gkp
-```
-
-`gkp_outcome_counts.json` entries should look like:
-
-```json
-{"counts":[{"sample":[0,1,0],"count":12},{"sample":[1,0,1],"count":5}]}
+bash hardware_integration/xanadu/run.sh
+bash hardware_integration/xanadu/run_public_datasets.sh
+bash hardware_integration/xanadu/xandau_hardware_data.sh --install-deps
 ```
 
-### Legacy Xanadu job JSON
-
-Use the converter with exported job JSON payloads:
-
-```bash
-python3 examples/hardware_integration/convert_xanadu_job_to_decoder_io.py \
---input /path/to/xanadu_job.json \
---mapping /path/to/your_mapping.json \
---out examples/results/hardware_integration/decoder_requests.ndjson \
---sigma 0.18 \
---gate-error-rate 0.0007 \
---meas-error-rate 0.0009 \
---idle-error-rate 0.0003 \
---meta hardware=xanadu \
---meta backend=X8_01
-```
-
-## Mapping Notes
-
-`stabilizers` entries define syndrome-event generation by parity:
-
-- `index`: stabilizer index in LiDMaS+ space.
-- `type`: `X`, `Z`, or `UNKNOWN`.
-- `modes`: list of measured-mode indices used for parity.
-- `mod`: modulus (default 2).
-- `trigger_on`: event if `(sum(modes) % mod) == trigger_on` (default 1).
-- `time_offset_ns`: optional per-stabilizer event timestamp offset.
-
-This mapping is hardware- and experiment-specific.
-Illustrative mappings may produce non-physical syndromes; use `--decoder_io_continue_on_error` during early integration.
-
-## Large-Data Flags
+Outputs remain in:
 
-- `--stream`: uses NumPy memory-mapped loading when available (`.npy`), avoiding full in-memory expansion.
-- `--shot-start N`: skip the first `N` expanded shots before writing.
-- `--max-shots K`: write at most `K` shots this run.
-- `--append-out`: append NDJSON to an existing output file.
-- `--progress-every M`: emit progress every `M` written shots to stderr.
+- `examples/results/hardware_integration/`
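The count-compressed format removed above (`{"counts":[{"sample":[0,1,0],"count":12},{"sample":[1,0,1],"count":5}]}`) is logically a run-length encoding of shots. A minimal expansion sketch, independent of the repo's `count_table_json` converter:

```python
import json

# Minimal sketch (not the repo converter): expand a count-compressed outcome
# table into individual shots, yielding one mode list per shot.
def expand_counts(count_table):
    for entry in count_table["counts"]:
        for _ in range(entry["count"]):
            yield list(entry["sample"])

table = json.loads('{"counts":[{"sample":[0,1,0],"count":2},{"sample":[1,0,1],"count":1}]}')
shots = list(expand_counts(table))
print(len(shots))  # -> 3 shots total
```

A streaming converter would feed each expanded shot through the same mapping logic as uncompressed sources rather than materializing the full list.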

hardware_integration/README.md

Lines changed: 19 additions & 0 deletions (new file)

@@ -0,0 +1,19 @@
+# Hardware Integration
+
+Provider-oriented hardware integration entry point for LiDMaS+.
+
+## Providers
+
+- `xanadu/`: converters, mappings, demos, and real-data download/replay scripts for Aurora/QCA/GKP workflows.
+
+## Quick Start
+
+```bash
+bash hardware_integration/xanadu/run.sh
+bash hardware_integration/xanadu/run_public_datasets.sh
+bash hardware_integration/xanadu/xandau_hardware_data.sh --install-deps
+```
+
+Outputs stay centralized in:
+
+- `examples/results/hardware_integration/`
