Conversation
fldStIt->getVariableKeysFromFldSt(state, typeVar, keyCount, fldStIt->m_namesOfKeys, fldStIt->m_indexesForKeyVar);
for (std::string nm : fldStIt->m_namesOfKeys) {
// Gather key lists for each field set and build the union of all keys (optionally filtered)
for (auto &fldSt : m_annualFields) {
Lots of opportunities for range-based loop conversions in here. I applied as many as I could, except for places where we have nested iterator-based loops indexing off one another.
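For illustration, a minimal sketch of that kind of conversion; the member name m_namesOfKeys comes from the quoted diff, while the surrounding struct and function are made up for the example:

#include <string>
#include <vector>

// Stand-in for the real field-set type; only the member used below is modeled.
struct FieldSetSketch
{
    std::vector<std::string> m_namesOfKeys;
};

void visitKeys(std::vector<FieldSetSketch> &annualFields)
{
    // Iterator style (before):
    // for (auto fldStIt = annualFields.begin(); fldStIt != annualFields.end(); ++fldStIt) { ... }

    // Range-based style (after); the inner loop takes a const reference so each
    // std::string is not copied per iteration.
    for (auto &fldSt : annualFields) {
        for (std::string const &nm : fldSt.m_namesOfKeys) {
            (void)nm; // real code would gather the key name here
        }
    }
}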
fldStIt->m_cell[tableRowIndex].result = 0.0;
fldSt.m_cell[tableRowIndex].indexesForKeyVar = (foundKeyIndex >= 0) ? fldSt.m_indexesForKeyVar[foundKeyIndex] : -1;
// Initialize result based on aggregation kind
switch (fldSt.m_aggregate) {
I applied switch/case where it made sense. All these if-else blocks based on enums seem prime for this pattern.
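As a rough sketch of the pattern (the AggregationKind values appear in the quoted diff; the enum, cell struct, and function here are placeholders, not the module's actual declarations):

// Placeholder enum and cell type; only the values visible in the diff are listed.
enum class AggregationKind { maximum, minimum, minimumDuringHoursShown /* , ... */ };

struct CellSketch { double result = 0.0; };

void initResultForAggregation(CellSketch &cell, AggregationKind aggregate)
{
    // One switch replaces a chain of if/else-if comparisons against the enum.
    switch (aggregate) {
    case AggregationKind::maximum:
        cell.result = -9.9e99; // overwritten by the first value seen
        break;
    case AggregationKind::minimum:
    case AggregationKind::minimumDuringHoursShown:
        cell.result = 9.9e99;
        break;
    default:
        cell.result = 0.0;
        break;
    }
}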
constexpr Real64 veryLarge = 1.0E280;
constexpr Real64 verySmall = -1.0E280;
Maybe I shouldn't move these to the header?
static Real64 const storedMaxVal(std::numeric_limits<Real64>::max());
static Real64 const storedMinVal(std::numeric_limits<Real64>::lowest());
static constexpr Real64 storedMaxVal(std::numeric_limits<Real64>::max());
static constexpr Real64 storedMinVal(std::numeric_limits<Real64>::lowest());
Is there any reason not to reuse these? Or even replace veryLarge and verySmall with these numeric_limits values?
I think these are only used to avoid writing to the tables when there are no updated minVal/maxVal values, i.e., it doesn't matter what these initializations are, just that they are larger or smaller than any expected result.
minVal = storedMaxVal;
maxVal = storedMinVal;
if (curVal > maxVal) maxVal = curVal;
if (curVal < minVal) minVal = curVal;
if (minVal != storedMaxVal) {
tableBody(columnRecount, 15) = RealToStr(minVal, digitsShown);
}
if (maxVal != storedMinVal) {
tableBody(columnRecount, 16) = RealToStr(maxVal, digitsShown);
}
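If the literals were swapped for numeric_limits, the sentinel logic above would look roughly like this (a standalone sketch, not the module's actual code; Real64 is assumed to be a double typedef):

#include <limits>

using Real64 = double; // assumed here so the sketch stands alone

// Sentinels: start minVal impossibly high and maxVal impossibly low, so any
// real sample replaces them; untouched sentinels mean "no data, skip the cell".
static constexpr Real64 storedMaxVal = std::numeric_limits<Real64>::max();
static constexpr Real64 storedMinVal = std::numeric_limits<Real64>::lowest();

void trackExtremes(Real64 curVal, Real64 &minVal, Real64 &maxVal)
{
    if (curVal > maxVal) maxVal = curVal;
    if (curVal < minVal) minVal = curVal;
}

bool minWasUpdated(Real64 minVal) { return minVal != storedMaxVal; }
bool maxWasUpdated(Real64 maxVal) { return maxVal != storedMinVal; }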
fldSt.m_cell[tableRowIndex].result = -9.9e99;
break;
case AnnualFieldSet::AggregationKind::minimum:
case AnnualFieldSet::AggregationKind::minimumDuringHoursShown:
fldSt.m_cell[tableRowIndex].result = 9.9e99;
Same question for these? Replace with veryLarge or numeric_limits?
These are strange values to use for a table cell initialization. I'd have to see how these look in a table before I could suggest an alternative. I do think "99999" data in cells means no data, but this "9.9e99" style seems different, and I don't recall ever seeing that value in tables. Certainly there could be valid cell data larger than 99999.
Real64 realValue = -99999.0;
return -99999.0;
They could just be veryLarge and verySmall
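Using the named constants instead of the raw literals would look roughly like this (a sketch only; whether the constants belong in the header or the source file is the open question above, and the helper function is hypothetical):

using Real64 = double; // assumed typedef for the sketch

constexpr Real64 veryLarge = 1.0E280;
constexpr Real64 verySmall = -1.0E280;

// Minimum-style aggregations start high so any real value lowers them;
// maximum-style aggregations start low so any real value raises them.
void initCellResult(Real64 &result, bool minimumStyleAggregation)
{
    result = minimumStyleAggregation ? veryLarge : verySmall;
}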
}
fldStIt->m_cell[row].deferredElapsed.push_back(elapsedTime); // save the amount of time for this particular value
newDuration = oldDuration + elapsedTime;
// newDuration = oldDuration + elapsedTime;
This was unused; I just commented it out for now.
cell.result = -9.9e99;
break;
case AnnualFieldSet::AggregationKind::minimum:
case AnnualFieldSet::AggregationKind::minimumDuringHoursShown:
cell.result = 9.9e99;
veryLarge / numeric_limits
The only regression file I found that exercised this is PurchAir_wAnnual, if you don't force design days. Running regressions on that file (and the entire suite) passed locally with no diffs.

I forced annual simulations on the entire test suite and ran regressions locally on 69778ff. No diffs. This is ready to go, unless there are questions or comments.

@mitchute let me take a quick look, since I was the original author.
No worries. Take your time. |
JasonGlazer left a comment
Looks good, sorry I didn't look at these changes sooner.
if ((fldStIt->m_aggregate == AnnualFieldSet::AggregationKind::maximum) ||
    (fldStIt->m_aggregate == AnnualFieldSet::AggregationKind::minimum)) {
for (auto const &fldSt : m_annualFields) {
switch (fldSt.m_aggregate) {
Pull request overview
After working on #11410, I started looking around in here and realized I could spend a little time streamlining some of the things in this module without too much trouble. That effort uncovered the potential bug in #11420. I tried to focus on applying some modernization and best practices (as far as I understand them). If this causes issues for one reason or another, let me know.