
Fix custom update #220

Closed
wants to merge 2 commits into from
Conversation

@aeudes (Contributor) commented Oct 27, 2019

This PR fixes two different problems related to custom function updates.

=== When streaming (eda7c54) ===
If a custom function is used and the streaming time is reset, the custom function is no longer updated.
Steps to reproduce:

  • start streaming
  • add a custom function related to a topic and plot it
  • start the bag
  • everything is OK
  • restart the bag
  • the buffer is cleared and normal topics are displayed again, but custom topics are no longer updated.

=== During data loading (e8649d7) ===
If data are loaded after a custom function is defined, the custom function is not updated.
Step by step:

  • load a layout with a custom function defined in it (keep the placeholder).
  • load data from a bag.

@facontidavide (Owner)

Hi,

I will review this and merge it ASAP.

@facontidavide self-assigned this Oct 27, 2019
@facontidavide (Owner)

I am not sure I understand how the changes in CustomFunction::calculate fix the issues.

Applying only the change in MainWindow::updateDataAndReplot seems to be sufficient to fix the error.

@aeudes (Contributor, Author) commented Oct 28, 2019

The bug hides in the fact that, when time is reset, in this loop:

for(size_t i=0; i < src_data.size(); ++i)
{
    if( src_data.at(i).x > _last_updated_timestamp)
    {
        dst_data->pushBack( calculatePoint(calcFct, src_data, channel_data, chan_values, i ) );
    }
}
if (dst_data->size() != 0){
    _last_updated_timestamp = dst_data->back().x;
}

the marker _last_updated_timestamp is never reset, so after a time reset every new source timestamp is below it. Since the buffer was cleared, dst_data is empty, _last_updated_timestamp is never updated, and nothing is ever pushed again.

I changed from a time constraint to an index because it seems clearer to me to track the transform's progress that way. Maybe I am missing something; I can rework the pull request without changing how the indexing is done if you prefer.
The other part of this change avoids retrieving the auxiliary data when there is no new source data (since the other commit forces the update to be called more often). The index-based idea is sketched below.
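
For illustration only, a minimal sketch of the index-based approach; the member name _last_update_index and the shrink check are hypothetical and not the exact diff of this PR:

// Sketch: track progress by index instead of by timestamp.
// _last_update_index is a hypothetical size_t member, initialized to 0.
if (src_data.size() < _last_update_index)
{
    // The source shrank: its buffer was cleared upstream, so start over.
    dst_data->clear();
    _last_update_index = 0;
}
// Process only the points appended since the last update.
for (size_t i = _last_update_index; i < src_data.size(); ++i)
{
    dst_data->pushBack( calculatePoint(calcFct, src_data, channel_data, chan_values, i ) );
}
_last_update_index = src_data.size();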

@facontidavide (Owner)

Ok, so what about this code?

if (dst_data->size() == 0){
    _last_updated_timestamp = -std::numeric_limits<double>::max();
}
else{
    _last_updated_timestamp = dst_data->back().x;
}

for(size_t i=0; i < src_data.size(); ++i)
{
    if( src_data.at(i).x > _last_updated_timestamp)
    {
        dst_data->pushBack( calculatePoint(calcFct, src_data, channel_data, chan_values, i ) );
    }
}

Based on what you said, it might make more sense to move the update of _last_updated_timestamp before the for-loop, as done in the snippet above.

@facontidavide (Owner)

Actually, I am realizing that since the timestamps must be monotonic, I don't even need to remember _last_updated_timestamp... I can write this code instead:

for(size_t i=0; i < src_data.size(); ++i)
{
    if( dst_data->size() == 0 || src_data.at(i).x >= dst_data->back().x)
    {
        dst_data->pushBack( calculatePoint(calcFct, src_data, channel_data, chan_values, i ) );
    }
}

@aeudes (Contributor, Author) commented Oct 28, 2019

OK, this one is better. I am a bit bothered by the monotonicity assumption on time, since it is not true for time series, but the original code relies on it, so...
I am still sure that a reset of src must be detected in order to clear the dst buffer (that is not done when the buffer is cleared, since the transform only does a pushBack). I will run some tests in this direction and propose an update of this PR.

if (dst_data->size() > src_data.size())
{
    dst_data->clear();
}

for(size_t i=0; i < src_data.size(); ++i)
{
    if( dst_data->size() == 0 || src_data.at(i).x >= dst_data->back().x)
    {
        dst_data->pushBack( calculatePoint(calcFct, src_data, channel_data, chan_values, i ) );
    }
}

@facontidavide (Owner) commented Oct 28, 2019

Please confirm: the streaming bug happens ONLY when you use this option:

[screenshot: the option referenced above]

Am I right?

@facontidavide (Owner)

Unfortunately this condition is triggered spuriously...

        if (dst_data->size() > src_data.size())  dst_data->clear();

With the latest changes, both the original time series and the custom one need a "Clear Points" reset when you "rewind time". One hypothetical way the condition above can fire without a rewind is sketched below.
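
For illustration only, a stand-alone sketch of one purely hypothetical mechanism for a spurious trigger: if the source buffer is trimmed to a bounded size while the destination keeps all its points, dst can become larger than src with no time reset involved (all names below are invented for the demo):

#include <cstdio>
#include <deque>
#include <vector>

// Hypothetical demo: a size check mistakes front-trimming for a time reset.
int main()
{
    std::deque<double>  src;  // streaming buffer, trimmed to a max size
    std::vector<double> dst;  // custom series, never trimmed
    const size_t max_src_size = 5;

    for (int i = 0; i < 10; ++i)
    {
        src.push_back(i);        // stamps are strictly increasing
        dst.push_back(i);        // custom point derived from src
        if (src.size() > max_src_size)
        {
            src.pop_front();     // trimming, not a time reset
        }
        if (dst.size() > src.size())
        {
            // Fires from i == 5 onward even though time never rewound.
            std::printf("spurious clear at i=%d (dst=%zu, src=%zu)\n",
                        i, dst.size(), src.size());
        }
    }
    return 0;
}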

I am going to push the changes myself and close this PR.

facontidavide added a commit that referenced this pull request Oct 28, 2019
@aeudes (Contributor, Author) commented Nov 3, 2019

Sorry for the long delay in answering your questions and testing the new changes:

  • The bug is triggered even if the timestamp did not come from the header.
  • I have tested your modification, but this partial buffer clear is done by the ROS streaming:
    if( msg_time < _prev_clock_time )
    {
        // clean
        for (auto& it: dataMap().numeric )
        {
            it.second.clear();
            auto dst = _destination_data->numeric.find(it.first);
            if( dst != _destination_data->numeric.end()){
                dst->second.clear();
            }
        }
        for (auto& it: dataMap().user_defined )
        {
            it.second.clear();
            auto dst = _destination_data->user_defined.find(it.first);
            if( dst != _destination_data->user_defined.end()){
                dst->second.clear();
            }
        }
    }

I think your solution still does not reset the custom data. Since it is still good to have only a partial reset rather than silently clearing other data sources, I have made a proposal (0a6ad3a) in the updated branch to force a clear of the custom transforms when a time reset is detected.

  • Also, with the new loop, when non-monotonic stamps are processed, the custom series is filtered to be monotonic (contrary to the previous behavior) and will possibly drop some data. This causes the custom time series to have a different size/indexing than the original series. I am not sure this is the desired behavior. See 0ff62f7. A small trace of this is sketched below.
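For illustration only, a stand-alone demo of how the >= condition drops non-monotonic points (the input stamps are invented for the example):

#include <cstdio>
#include <vector>

// Hypothetical demo: the monotonic filter from the loop above drops
// any point whose stamp is lower than the last accepted one.
int main()
{
    std::vector<double> src = {1.0, 2.0, 1.5, 3.0};  // 1.5 is out of order
    std::vector<double> dst;
    for (double x : src)
    {
        if (dst.empty() || x >= dst.back())
        {
            dst.push_back(x);
        }
    }
    // Prints: src=4 dst=3 -> the point at x=1.5 was dropped,
    // so the custom series no longer matches the source size/indexing.
    std::printf("src=%zu dst=%zu\n", src.size(), dst.size());
    return 0;
}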

facontidavide added a commit that referenced this pull request Nov 10, 2019
facontidavide added a commit that referenced this pull request Nov 10, 2019
@facontidavide (Owner)

I applied your changes. I think this should do. Thanks again!!!
