Fix applying sensor scale on first read
In Sensor::addValue(), the adjustValue() function was being called
before the _scale member was set, so until the first interval timer
expired the sensor value would be unscaled. For example, it could
show a temperature of 27000.0 instead of 27.0.
To fix this, reorder the code so that _scale is set first, then the
sensor value is adjusted based on _scale, and finally the adjusted
value is set on the Value interface.
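
The effect of the reordering can be shown with a small standalone
sketch (hypothetical names and values; only adjustValue() and the
27000.0 -> 27.0 example come from this change):

    #include <cmath>
    #include <cstdint>
    #include <iostream>

    // Hypothetical stand-in for Sensor::adjustValue(): applies a
    // power-of-ten scale exponent the way hwmon scale attributes do.
    static double adjustValue(double val, int64_t scale)
    {
        return val * std::pow(10.0, static_cast<double>(scale));
    }

    int main()
    {
        double raw = 27000.0; // raw millidegree reading from hwmon
        int64_t scale = 0;    // _scale before the attributes are read

        // Old order: adjust before the scale is known -> stays 27000.
        std::cout << adjustValue(raw, scale) << '\n';

        // New order: fetch the scale first (hwmon::getScale(attrs) in
        // the real code), then adjust -> prints 27.
        scale = -3;
        std::cout << adjustValue(raw, scale) << '\n';

        return 0;
    }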
Tested:
Used a large INTERVAL config value so the first read of a sensor could
be inspected before the interval timer expired. Without the fix the
value was left unscaled; with the fix it was correct.
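
For example (hypothetical value; the exact units of INTERVAL depend on
the platform's hwmon env file):

    # A very long polling interval so the first reading can be
    # inspected over D-Bus before the timer fires again.
    INTERVAL=900000000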
Signed-off-by: Matt Spinler <spinler@us.ibm.com>
Change-Id: Id0e2c71961f343246a42ba4c546e806350040e01
diff --git a/sensor.cpp b/sensor.cpp
index 3c1bbe5..09aeca6 100644
--- a/sensor.cpp
+++ b/sensor.cpp
@@ -150,8 +150,6 @@
_ioAccess->read(_sensor.first, _sensor.second,
hwmon::entry::cinput, std::get<size_t>(retryIO),
std::get<std::chrono::milliseconds>(retryIO));
-
- val = adjustValue(val);
}
#ifdef UPDATE_FUNCTIONAL_ON_FAIL
catch (const std::system_error& e)
@@ -167,7 +165,6 @@
auto iface =
std::make_shared<ValueObject>(bus, objPath.c_str(), deferSignals);
- iface->value(val);
hwmon::Attributes attrs;
if (hwmon::getAttributes(_sensor.first, attrs))
@@ -177,6 +174,9 @@
_scale = hwmon::getScale(attrs);
}
+ val = adjustValue(val);
+ iface->value(val);
+
auto maxValue = env::getEnv("MAXVALUE", _sensor);
if (!maxValue.empty())
{