@marcorudolphflex marcorudolphflex commented Oct 7, 2025

Summary

Add opt-in local cache for simulation results hashed by simulation + runtime context

Background

Every repeated run of the same simulation currently re-uploads and re-downloads its data. We want to cache the resulting .hdf5 locally (opt-in) and reuse it on identical runs. The cache key must go beyond `_hash_self()` because the solver version, workflow path, environment, and runtime options also influence the artifact we return.
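
For illustration, here is a minimal sketch of that composite-key idea. The PR exposes a `build_cache_key(simulation_hash, workflow_type, version)` helper (see the sequence diagram below), but the body here is an assumption, not the actual implementation:

```python
# Hypothetical sketch of the composite cache key; the real build_cache_key
# lives in tidy3d/web/cache.py and may differ in fields and encoding.
import hashlib
import json

def build_cache_key(simulation_hash: str, workflow_type: str, version: str) -> str:
    """Fold runtime context into the key alongside the simulation hash."""
    payload = json.dumps(
        {"simulation": simulation_hash, "workflow": workflow_type, "version": version},
        sort_keys=True,  # canonical ordering: identical contexts hash identically
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```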

Greptile Overview

Updated On: 2025-10-07 11:35:12 UTC

Summary

This PR introduces a comprehensive opt-in local caching system for Tidy3D simulation results to eliminate redundant uploads and downloads when running identical simulations. The cache system stores simulation result HDF5 files locally using a composite cache key that combines simulation hash with runtime context (solver version, workflow type, environment variables) to ensure cache validity beyond simple simulation parameters.

The implementation centers around a new SimulationCache class in tidy3d/web/cache.py that provides thread-safe LRU caching with configurable size and entry limits. The cache integrates seamlessly into the existing web API workflow by intercepting run() calls early to check for cached results and storing successful simulation outputs in the load() function. Cache entries are validated using SHA256 checksums and support atomic file operations to prevent corruption during concurrent access.
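
As a rough sketch of the checksum validation mentioned above (assumed details; the PR's actual logic lives in `CacheEntry.verify` and `_copy_and_hash` in tidy3d/web/cache.py):

```python
# Sketch only: streaming SHA256 so large .hdf5 artifacts never need to fit in memory.
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def entry_is_valid(artifact: Path, stored_checksum: str) -> bool:
    # An entry counts as valid only if the artifact exists and matches its recorded hash.
    return artifact.exists() and sha256_of(artifact) == stored_checksum
```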

Configuration is handled through a new SimulationCacheSettings class in config.py with sensible defaults (disabled by default, 10GB max size, 128 max entries, ~/.tidy3d/cache/simulations directory). The feature is exposed through an optional use_cache parameter across all API entry points (run(), run_async(), autograd functions) allowing per-call override of global cache settings.
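
As a hedged usage sketch of that per-call override (assuming a previously built `td.Simulation` named `simulation`; only `use_cache` is new here, the other arguments follow the existing `web.run` API):

```python
import tidy3d.web as web

# Force the cache on for this call, overriding the global setting.
# Passing use_cache=None (the default) falls back to the configured behavior.
sim_data = web.run(simulation, task_name="cache_demo", use_cache=True)
```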

The cache system handles both individual Job objects and batch operations through Batch objects, with error handling for edge cases such as cache corruption, missing files, and network failures. The cache itself is a thread-safe global singleton, with LRU eviction policies keeping it within its configured size.
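
For batch workflows the same flag applies to `run_async`; a sketch under the assumption that the flag simply propagates to each job (`sim_a` and `sim_b` are previously constructed simulations):

```python
from tidy3d.web import run_async

# Cached entries are reused per job, so only uncached simulations hit the server.
batch_data = run_async(simulations={"sim_a": sim_a, "sim_b": sim_b}, use_cache=True)
```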

Important Files Changed

Changed Files
| Filename | Score | Overview |
| --- | --- | --- |
| tidy3d/web/cache.py | 4/5 | New comprehensive cache implementation with thread-safe LRU caching, atomic operations, and checksum verification |
| tidy3d/web/api/container.py | 4/5 | Core integration of cache logic into Job/Batch classes with cache-aware workflow methods |
| tidy3d/web/api/webapi.py | 4/5 | Integration of cache lookup/storage into main run() and load() functions with proper error handling |
| tidy3d/config.py | 4/5 | New SimulationCacheSettings configuration class with validation and sensible defaults |
| tests/test_web/test_simulation_cache.py | 4/5 | Comprehensive test coverage for cache functionality with extensive mocking of the web pipeline |
| tidy3d/web/api/autograd/autograd.py | 5/5 | Clean addition of use_cache parameter to autograd-compatible run functions |
| tidy3d/web/api/autograd/engine.py | 5/5 | Simple addition of use_cache to the job_fields list for proper parameter propagation |
| tidy3d/web/api/asynchronous.py | 5/5 | Addition of use_cache parameter to run_async with proper documentation |

Confidence score: 4/5

  • This PR introduces significant new functionality with good architectural design and comprehensive error handling
  • Score reflects complex caching logic that could have edge cases, but implementation follows solid engineering practices
  • Pay close attention to tidy3d/web/cache.py and tidy3d/web/api/container.py for the core caching logic and integration points

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant WebAPI as "web.run()"
    participant Cache as "SimulationCache"
    participant Job as "Job"
    participant Server as "Server"
    participant Stub as "Tidy3dStub"

    User->>WebAPI: "run(simulation, use_cache=True)"
    WebAPI->>Cache: "resolve_simulation_cache(use_cache)"
    Cache-->>WebAPI: "SimulationCache instance or None"
    
    alt Cache enabled
        WebAPI->>Cache: "try_fetch(simulation)"
        Cache->>Cache: "build_cache_key(simulation_hash, workflow_type, version)"
        Cache->>Cache: "_fetch(cache_key)"
        alt Cache hit
            Cache-->>WebAPI: "CacheEntry"
            WebAPI->>Cache: "materialize(path)"
            Cache-->>WebAPI: "Cached data path"
            WebAPI->>Stub: "postprocess(path)"
            Stub-->>WebAPI: "SimulationData"
            WebAPI-->>User: "SimulationData (from cache)"
        else Cache miss
            Cache-->>WebAPI: "None"
            WebAPI->>Job: "upload(simulation)"
            Job->>Server: "Upload simulation"
            Server-->>Job: "task_id"
            WebAPI->>Job: "start(task_id)"
            Job->>Server: "Start simulation"
            WebAPI->>Job: "monitor(task_id)"
            Job->>Server: "Poll status"
            Server-->>Job: "Status updates"
            WebAPI->>Job: "download(task_id, path)"
            Job->>Server: "Download results"
            Server-->>Job: "Simulation data file"
            WebAPI->>Stub: "postprocess(path)"
            Stub-->>WebAPI: "SimulationData"
            WebAPI->>Cache: "store_result(stub_data, task_id, path, workflow_type)"
            Cache->>Cache: "_store(cache_key, task_id, source_path, metadata)"
            WebAPI-->>User: "SimulationData"
        end
    else Cache disabled
        WebAPI->>Job: "upload(simulation)"
        Job->>Server: "Upload simulation"
        Server-->>Job: "task_id"
        WebAPI->>Job: "start(task_id)"
        Job->>Server: "Start simulation"
        WebAPI->>Job: "monitor(task_id)"
        Job->>Server: "Poll status"
        Server-->>Job: "Status updates"
        WebAPI->>Job: "download(task_id, path)"
        Job->>Server: "Download results"
        Server-->>Job: "Simulation data file"
        WebAPI->>Stub: "postprocess(path)"
        Stub-->>WebAPI: "SimulationData"
        WebAPI-->>User: "SimulationData"
    end
```

@greptile-apps greptile-apps bot left a comment

8 files reviewed, 10 comments

github-actions bot commented Oct 7, 2025

Diff Coverage

Diff: origin/develop...HEAD, staged and unstaged changes

  • tidy3d/config/loader.py (100%)
  • tidy3d/config/sections.py (96.2%): Missing lines 399
  • tidy3d/web/api/autograd/engine.py (100%)
  • tidy3d/web/api/container.py (78.7%): Missing lines 267-270,272,317,385,448,856,1308-1309,1320-1321,1326-1327,1329,1336-1337,1339,1344,1364-1365,1411
  • tidy3d/web/api/run.py (57.1%): Missing lines 239-241
  • tidy3d/web/api/webapi.py (85.5%): Missing lines 341-343,378,428,431,593,598,1394,1398
  • tidy3d/web/cache.py (82.4%): Missing lines 59,62,65-66,73-74,111-112,150,182,184-186,196-198,201-204,216,221,232-242,246,253,256-257,267-268,315-316,334,356-357,436,441,443,445,447,449,451,513-515
  • tidy3d/web/core/http_util.py (100%)

Summary

  • Total: 521 lines
  • Missing: 91 lines
  • Coverage: 82%

tidy3d/config/sections.py

  395         return (base / "cache" / "simulations").resolve()
  396     else:
  397         xdg_cache = os.getenv("XDG_CACHE_HOME")
  398         if xdg_cache:
! 399             base = Path(xdg_cache).expanduser().resolve()
  400         else:
  401             base = Path.home() / ".cache"
  402     return (base / "tidy3d" / "simulations").resolve()

tidy3d/web/api/container.py

  263         os.replace(tmp, dst_path)
  264 
  265     def clear_stash(self) -> None:
  266         """Delete this job's stash file only."""
! 267         if self._stash_path:
! 268             try:
! 269                 if os.path.exists(self._stash_path):
! 270                     os.remove(self._stash_path)
  271             finally:
! 272                 self._stash_path = None
  273 
  274     def to_file(self, fname: PathLike) -> None:
  275         """Exports :class:`Tidy3dBaseModel` instance to .yaml, .json, or .hdf5 file

  313             self.upload()
  314             if priority is None:
  315                 self.start()
  316             else:
! 317                 self.start(priority=priority)
  318             self.monitor()
  319         data = self.load(path=path)
  320 
  321         return data

  381     @property
  382     def status(self):
  383         """Return current status of :class:`Job`."""
  384         if self.load_if_cached:
! 385             return "success"
  386         if web._is_modeler_batch(self.task_id):
  387             detail = self.get_info()
  388             status = detail.totalStatus.value
  389             return status

  444         To load the output of completed simulation into :class:`.SimulationData` objects,
  445         call :meth:`Job.load`.
  446         """
  447         if self.load_if_cached:
! 448             return
  449         web.monitor(self.task_id, verbose=self.verbose)
  450 
  451     def download(self, path: PathLike = DEFAULT_DATA_PATH) -> None:
  452         """Download results of simulation.

  852             self.to_file(self._batch_path(path_dir=path_dir))
  853             if priority is None:
  854                 self.start()
  855             else:
! 856                 self.start(priority=priority)
  857             self.monitor(path_dir=path_dir, download_on_success=True)
  858         else:
  859             console = get_logging_console()
  860             console.log("Found all simulations in cache.")

  1304                 os.path.exists(self._job_data_path(task_id=job.task_id, path_dir=path_dir))
  1305                 for job in self.jobs.values()
  1306             )
  1307             if num_existing > 0:
! 1308                 files_plural = "files have" if num_existing > 1 else "file has"
! 1309                 log.warning(
  1310                     f"{num_existing} {files_plural} already been downloaded "
  1311                     f"and will be skipped. To forcibly overwrite existing files, invoke "
  1312                     "the load or download function with `replace_existing=True`.",
  1313                     log_once=True,

  1316         fns = []
  1317 
  1318         for task_name, job in self.jobs.items():
  1319             if "error" in job.status:
! 1320                 log.warning(f"Not downloading '{task_name}' as the task errored.")
! 1321                 continue
  1322 
  1323             job_path = self._job_data_path(task_id=job.task_id, path_dir=path_dir)
  1324 
  1325             if job_path.exists():
! 1326                 if replace_existing:
! 1327                     log.info(f"File '{job_path}' already exists. Overwriting.")
  1328                 else:
! 1329                     log.info(f"File '{job_path}' already exists. Skipping.")
  1330                     continue
  1331 
  1332             if job.load_if_cached:
  1333                 job._materialize_from_stash(job_path)
  1334                 continue
  1335 
! 1336             def fn(job=job, job_path=job_path) -> None:
! 1337                 job.download(path=job_path)
  1338 
! 1339             fns.append(fn)
  1340 
  1341         if not fns:
  1342             return
  1343 
! 1344         with ThreadPoolExecutor(max_workers=self.num_workers) as executor:
  1345             futures = [executor.submit(fn) for fn in fns]
  1346 
  1347             if self.verbose:
  1348                 console = get_logging_console()

  1360                         completed += 1
  1361                         progress.update(pbar, completed=completed)
  1362             else:
  1363                 # Still ensure completion if verbose is off
! 1364                 for _ in concurrent.futures.as_completed(futures):
! 1365                     pass
  1366 
  1367     def load(
  1368         self,
  1369         path_dir: PathLike = DEFAULT_DATA_DIR,

  1407 
  1408         loaded = {task_name: job.load_if_cached for task_name, job in self.jobs.items()}
  1409 
  1410         if not skip_download:
! 1411             self.download(path_dir=path_dir, replace_existing=replace_existing)
  1412 
  1413         data = BatchData(
  1414             task_paths=task_paths,
  1415             task_ids=task_ids,

tidy3d/web/api/run.py

  235     if len(h2sim) == 1:
  236         if path is not None:
  237             # user may submit the same simulation multiple times and not specify an extension, but dir path
  238             if not Path(path).suffixes:
! 239                 path = f"{path}.hdf5"
! 240                 console = get_logging_console()
! 241                 console.log(f"Changed output path to {path}")
  242         else:
  243             path = DEFAULT_DATA_PATH
  244         h, sim = next(iter(h2sim.items()))
  245         data = {

tidy3d/web/api/webapi.py

  337     if entry is not None:
  338         try:
  339             entry.materialize(Path(path))
  340             return True
! 341         except Exception:
! 342             return False
! 343     return False
  344 
  345 
  346 def restore_simulation_if_cached(
  347     simulation: WorkflowType,

  374     retrieved_simulation_path = None
  375     if simulation_cache is not None:
  376         sim_for_cache = simulation
  377         if isinstance(simulation, (ModeSolver, ModeSimulation)):
! 378             sim_for_cache = get_reduced_simulation(simulation, reduce_simulation)
  379         entry = simulation_cache.try_fetch(simulation=sim_for_cache, verbose=verbose)
  380         if entry is not None:
  381             if path is not None:
  382                 copied = _copy_simulation_data_from_cache_entry(entry, path)

  424             task_id=None,
  425             path=str(restored_path),
  426         )
  427         if isinstance(simulation, ModeSolver):
! 428             simulation._patch_data(data=data)
  429         return data
  430     else:
! 431         return None
  432 
  433 
  434 @wait_for_connection
  435 def run(

  589     """Log task and folder links to the web UI."""
  590     if (task_type in ["RF", "COMPONENT_MODELER", "TERMINAL_COMPONENT_MODELER"]) and isinstance(
  591         simulation, TerminalComponentModeler
  592     ):
! 593         url = _get_url_rf(group_id or resource_id)
  594     else:
  595         url = _get_url(resource_id)
  596 
  597     if folder_id is not None:
! 598         folder_url = _get_folder_url(folder_id)
  599     else:
  600         folder_url = None
  601     return url, folder_url

  1390         task_id
  1391         and _is_modeler_batch(task_id)
  1392         and path.name in {"simulation_data.hdf5", "simulation_data.hdf5.gz"}
  1393     ):
! 1394         path = path.with_name(path.name.replace("simulation", "cm"))
  1395 
  1396     if task_id is None:
  1397         if not path.exists():
! 1398             raise FileNotFoundError("Cached file not found.")
  1399     elif not path.exists() or replace_existing:
  1400         download(task_id=task_id, path=path, verbose=verbose, progress_callback=progress_callback)
  1401 
  1402     if verbose and task_id is not None:

tidy3d/web/cache.py

  55         return self.path.exists() and self.artifact_path.exists() and self.metadata_path.exists()
  56 
  57     def verify(self) -> bool:
  58         if not self.exists():
! 59             return False
  60         checksum = self.metadata.get("checksum")
  61         if not checksum:
! 62             return False
  63         try:
  64             actual_checksum, file_size = _copy_and_hash(self.artifact_path, None)
! 65         except FileNotFoundError:
! 66             return False
  67         if checksum != actual_checksum:
  68             log.warning(
  69                 "Simulation cache checksum mismatch for key '%s'. Removing stale entry.", self.key
  70             )
  71             return False
  72         if int(self.metadata.get("file_size", file_size)) != file_size:
! 73             self.metadata["file_size"] = file_size
! 74             _write_metadata(self.metadata_path, self.metadata)
  75         return True
  76 
  77     def materialize(self, target: Path) -> Path:
  78         """Copy cached artifact to ``target`` and return the resulting path."""

  107                 try:
  108                     shutil.rmtree(self._root)
  109                     if not hard:
  110                         self._root.mkdir(parents=True, exist_ok=True)
! 111                 except (FileNotFoundError, OSError):
! 112                     pass
  113 
  114     def _fetch(self, key: str) -> Optional[CacheEntry]:
  115         """Retrieve an entry by key, verifying checksum."""
  116         with self._lock:

  146             Representation of the stored cache entry.
  147         """
  148         source_path = Path(source_path)
  149         if not source_path.exists():
! 150             raise FileNotFoundError(f"Cannot cache missing artifact: {source_path}")
  151         os.makedirs(self._root, exist_ok=True)
  152         tmp_dir = Path(tempfile.mkdtemp(prefix=TMP_PREFIX, dir=self._root))
  153         tmp_artifact = tmp_dir / CACHE_ARTIFACT_NAME
  154         tmp_meta = tmp_dir / CACHE_METADATA_NAME

  178                         )
  179                         os.replace(final_dir, backup_dir)
  180                     # move tmp_dir into place
  181                     os.replace(tmp_dir, final_dir)
! 182                 except Exception:
  183                     # restore backup if needed
! 184                     if backup_dir and backup_dir.exists():
! 185                         os.replace(backup_dir, final_dir)
! 186                     raise
  187                 else:
  188                     entry = CacheEntry(key=key, root=self._root, metadata=metadata)
  189                     if backup_dir and backup_dir.exists():
  190                         shutil.rmtree(backup_dir, ignore_errors=True)

  192                     return entry
  193         finally:
  194             try:
  195                 if tmp_dir.exists():
! 196                     shutil.rmtree(tmp_dir, ignore_errors=True)
! 197             except FileNotFoundError:
! 198                 pass
  199 
  200     def invalidate(self, key: str) -> None:
! 201         with self._lock:
! 202             entry = self._load_entry(key)
! 203             if entry:
! 204                 self._remove_entry(entry)
  205 
  206     def _ensure_limits(self, incoming_size: int) -> None:
  207         max_entries = self.max_entries
  208         max_size_bytes = int(self.max_size_gb * (1024**3))

  212             self._evict(entries, keep=max_entries - 1)
  213             entries = list(self._iter_entries())
  214 
  215         if max_size_bytes == 0:  # no limit
! 216             return
  217 
  218         existing_size = sum(int(e.metadata.get("file_size", 0)) for e in entries)
  219         allowed_size = max(max_size_bytes - incoming_size, 0)
  220         if existing_size > allowed_size:
! 221             self._evict_by_size(entries, existing_size, allowed_size)
  222 
  223     def _evict(self, entries: Iterable[CacheEntry], keep: int) -> None:
  224         sorted_entries = sorted(entries, key=lambda e: e.metadata.get("last_used", ""))
  225         to_remove = sorted_entries[: max(0, len(sorted_entries) - keep)]

  228 
  229     def _evict_by_size(
  230         self, entries: Iterable[CacheEntry], current_size: int, allowed_size: float
  231     ) -> None:
! 232         if allowed_size < 0:
! 233             allowed_size = 0
! 234         sorted_entries = sorted(entries, key=lambda e: e.metadata.get("last_used", ""))
! 235         reclaimed = 0
! 236         for entry in sorted_entries:
! 237             if current_size - reclaimed <= allowed_size:
! 238                 break
! 239             size = int(entry.metadata.get("file_size", 0))
! 240             self._remove_entry(entry)
! 241             reclaimed += size
! 242             log.info(f"Simulation cache evicted entry '{entry.key}' to reclaim {size} bytes.")
  243 
  244     def _iter_entries(self) -> Iterable[CacheEntry]:
  245         if not self._root.exists():
! 246             return []
  247         entries: list[CacheEntry] = []
  248         for child in self._root.iterdir():
  249             if child.name.startswith(TMP_PREFIX) or child.name.startswith(TMP_BATCH_PREFIX):
  250                 continue
  251             meta_path = child / CACHE_METADATA_NAME
  252             if not meta_path.exists():
! 253                 continue
  254             try:
  255                 metadata = json.loads(meta_path.read_text(encoding="utf-8"))
! 256             except Exception:
! 257                 metadata = {}
  258             entries.append(CacheEntry(key=child.name, root=self._root, metadata=metadata))
  259         return entries
  260 
  261     def _load_entry(self, key: str) -> Optional[CacheEntry]:

  263         if not entry.metadata_path.exists() or not entry.artifact_path.exists():
  264             return None
  265         try:
  266             metadata = json.loads(entry.metadata_path.read_text(encoding="utf-8"))
! 267         except Exception:
! 268             metadata = {}
  269         entry.metadata = metadata
  270         return entry
  271 
  272     def _touch(self, entry: CacheEntry) -> None:

  311                     f"Simulation cache hit for workflow '{workflow_type}'; using local results."
  312                 )
  313 
  314             return entry
! 315         except Exception as e:
! 316             log.error("Failed to fetch cache results: " + str(e))
  317 
  318     def store_result(
  319         self,
  320         stub_data: WorkflowDataType,

  330         try:
  331             simulation_obj = getattr(stub_data, "simulation", None)
  332             simulation_hash = simulation_obj._hash_self() if simulation_obj is not None else None
  333             if not simulation_hash:
! 334                 return
  335 
  336             version = _get_protocol_version()
  337 
  338             cache_key = build_cache_key(

  352                 key=cache_key,
  353                 source_path=Path(path),
  354                 metadata=metadata,
  355             )
! 356         except Exception as e:
! 357             log.error(f"Could not store cache entry: {e}")
  358 
  359 
  360 def _copy_and_hash(
  361     source: Path, dest: Optional[Path], existing_hash: Optional[str] = None

  432 def _canonicalize(value: Any) -> Any:
  433     """Convert value into a JSON-serializable object for hashing/metadata."""
  434 
  435     if isinstance(value, dict):
! 436         return {
  437             str(k): _canonicalize(v)
  438             for k, v in sorted(value.items(), key=lambda item: str(item[0]))
  439         }
  440     if isinstance(value, (list, tuple)):
! 441         return [_canonicalize(v) for v in value]
  442     if isinstance(value, set):
! 443         return sorted(_canonicalize(v) for v in value)
  444     if isinstance(value, Enum):
! 445         return value.value
  446     if isinstance(value, Path):
! 447         return str(value)
  448     if isinstance(value, datetime):
! 449         return value.isoformat()
  450     if isinstance(value, bytes):
! 451         return value.decode("utf-8", errors="ignore")
  452     return value
  453 
  454 
  455 def build_cache_key(

  509     )
  510 
  511     try:
  512         return _CACHE
! 513     except Exception as err:
! 514         log.debug(f"Simulation cache unavailable: {err}")
! 515         return None
  516 
  517 
  518 resolve_local_cache()

@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from 19688af to 4539ed9 Compare October 14, 2025 08:27
@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from 4539ed9 to 80d7e47 Compare October 22, 2025 10:21
@lucas-flexcompute lucas-flexcompute left a comment

I believe this works for photonforge!
Just to make sure I understand correctly, we basically do:

```python
cache = get_cache()
entry = cache.try_fetch(simulation)
if entry: ...
# later, after loading results
cache.store_result(simulation_data, task_id, path)
```

Can I set path to be the actual cache path? I don't want the user having to worry about where the data will reside.

@lucas-flexcompute (Collaborator)

I also just realized there's no way to retrieve the cached data without copying (`shutil.copy2(self.artifact_path, target)`).
Can we have the option to skip copying? Imagine doing this for hundreds of files in a single circuit simulation… Ideally, I'd like to just retrieve the data itself.

@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from 80d7e47 to fafeb3c Compare October 23, 2025 12:16
@marcorudolphflex (Contributor Author)

> I also just realized there's no way to retrieve the cached data without copying (`shutil.copy2(self.artifact_path, target)`). Can we have the option to skip copying? Imagine doing this for hundreds of files in a single circuit simulation… Ideally, I'd like to just retrieve the data itself.

I just added the option to specify no path (the default). So you could just use `load_simulation_if_cached(sim, use_cache=True)`. If `use_cache=None` (the default), the global cache setting from config and environment is used.
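
A short sketch of that cache-only lookup (the import path is an assumption; the None-on-miss behavior follows the comment above):

```python
import tidy3d.web as web
from tidy3d.web import load_simulation_if_cached  # import path assumed

sim_data = load_simulation_if_cached(sim, use_cache=True)
if sim_data is None:
    # Nothing cached for this simulation + runtime context: run it for real.
    sim_data = web.run(sim, task_name="uncached_run")
```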

@marcorudolphflex (Contributor Author)

marcorudolphflex commented Oct 23, 2025

> I believe this works for photonforge! Just to make sure I understand correctly, we basically do:
>
> ```python
> cache = get_cache()
> entry = cache.try_fetch(simulation)
> if entry: ...
> # later, after loading results
> cache.store_result(simulation_data, task_id, path)
> ```
>
> Can I set path to be the actual cache path? I don't want the user having to worry about where the data will reside.

I would just rely on `load_simulation_if_cached(sim)` to have a common interface. ~~This does the storing automatically.~~ If not found in the cache, `None` is returned.
Edit: storing is done with a regular `run`.

@lucas-flexcompute (Collaborator)

> I would just rely on `load_simulation_if_cached(sim)` to have a common interface. This does the storing automatically.

Ah, I see! Sorry, I completely missed that function! No problems, then!

@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from fafeb3c to c8e6049 Compare October 23, 2025 12:36
@marcorudolphflex (Contributor Author)

> > I would just rely on `load_simulation_if_cached(sim)` to have a common interface. This does the storing automatically.
>
> Ah, I see! Sorry, I completely missed that function! No problems, then!

Perfect. One correction to my last comment: `load_simulation_if_cached` does not store in the cache, but a regular `run` does if the cache is enabled or forced with `use_cache`.

@lucas-flexcompute (Collaborator)

> > > I would just rely on `load_simulation_if_cached(sim)` to have a common interface. This does the storing automatically.
> >
> > Ah, I see! Sorry, I completely missed that function! No problems, then!
>
> Perfect. One correction to my last comment: `load_simulation_if_cached` does not store in the cache, but a regular `run` does if the cache is enabled or forced with `use_cache`.

If I don't use `run` and instead load the results manually with `web.load`, I see that it has the same argument and should work as well, right?

@marcorudolphflex (Contributor Author)

> > > > I would just rely on `load_simulation_if_cached(sim)` to have a common interface. This does the storing automatically.
> > >
> > > Ah, I see! Sorry, I completely missed that function! No problems, then!
> >
> > Perfect. One correction to my last comment: `load_simulation_if_cached` does not store in the cache, but a regular `run` does if the cache is enabled or forced with `use_cache`.
>
> If I don't use `run` and instead load the results manually with `web.load`, I see that it has the same argument and should work as well, right?

Yup! Technically, the storing is done in `load`.
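
To illustrate the store-on-load point (a sketch; assumes a `task_id` from an earlier manual submission and that `web.load` accepts the same `use_cache` flag, as discussed above):

```python
import tidy3d.web as web

# After uploading/starting/monitoring a task yourself:
sim_data = web.load(task_id, path="out/sim_data.hdf5", use_cache=True)
# Per the discussion, the cache entry is written here, inside load().
```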

@yaugenst-flex yaugenst-flex left a comment

Thanks @marcorudolphflex, this will be a super useful feature to have!

@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch 2 times, most recently from 54acc71 to 99a21b6 Compare October 24, 2025 09:19
@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch 6 times, most recently from 9db9a0c to 86586e0 Compare October 27, 2025 11:34
@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from 86586e0 to 6e39d6c Compare October 27, 2025 12:10
@yaugenst-flex yaugenst-flex self-requested a review October 27, 2025 12:34
@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch 2 times, most recently from 8b3d741 to df6c375 Compare October 27, 2025 16:49
@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from df6c375 to 2b7c535 Compare October 28, 2025 08:18
@yaugenst-flex yaugenst-flex self-requested a review October 28, 2025 08:57
@yaugenst-flex yaugenst-flex left a comment

Thanks @marcorudolphflex, LGTM! Can you make sure all previous comments are resolved, and then we can merge.

@marcorudolphflex marcorudolphflex added this pull request to the merge queue Oct 28, 2025
@yaugenst-flex yaugenst-flex removed this pull request from the merge queue due to a manual request Oct 28, 2025

@momchil-flex momchil-flex left a comment

Thanks, this is much cleaner now!

My main comments are about making sure this is well documented for the user.

@marcorudolphflex marcorudolphflex force-pushed the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch from 3c650da to d4c2c89 Compare October 28, 2025 10:17
@yaugenst-flex yaugenst-flex added this pull request to the merge queue Oct 28, 2025
Merged via the queue into develop with commit 1d320b9 Oct 28, 2025
43 checks passed
@yaugenst-flex yaugenst-flex deleted the FXC-3294-add-opt-in-local-cache-for-simulation-results-hashed-by-simulation-runtime-context branch October 28, 2025 12:34