First of all, thank you very much for your work.
I would like to know why the results `ev1` and `ev2` on the synthetic data in "deephit_competing_risks.ipynb" differ by about 0.3 from those reported by the original authors (DeepHit: A Deep Learning Approach to Survival Analysis with Competing Risks. In Thirty-Second AAAI Conference on Artificial Intelligence, 2018). May I ask why this is?
Thank you for your answer.
I'm sorry if I didn't describe my question well; my question is not about the content of the dataset. What I mean is that, on the synthetic data, your deephit_competing_risks notebook gives a c_index that is 0.3 points worse than the one the authors report in the paper. How can I resolve this discrepancy?
Please note that in the example ipynb: "The survival function obtained with `predict_surv_df` is the probability of surviving any of the events, and does, therefore, not distinguish between the event types. This means that we evaluate this 'single-event case' as before."
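In other words, the notebook scores an all-cause survival curve, whereas the paper reports cause-specific time-dependent concordance on each cumulative incidence function (CIF), so the two numbers are not directly comparable. As a rough, illustrative sketch (not the paper's exact estimator, and not using the pycox API), here is a naive NumPy version of cause-specific time-dependent concordance in the spirit of Antolini's C^td; the function name, the toy CIF values, and the simple comparable-pair rule are all assumptions for illustration:

```python
import numpy as np

def cause_specific_concordance(cif, time_grid, durations, events, cause):
    """Naive time-dependent C-index for one competing risk.

    cif:       (len(time_grid), n) array; cif[t, i] estimates
               P(T <= time_grid[t], event == cause | x_i).
    durations: observed event/censoring times, one per subject.
    events:    event indicator per subject (0 = censored, 1, 2, ... = causes).
    cause:     which competing risk to score.
    """
    n = len(durations)
    concordant, comparable = 0.0, 0
    for i in range(n):
        if events[i] != cause:
            continue  # only subjects who experienced `cause` anchor pairs
        ti = np.searchsorted(time_grid, durations[i])
        for j in range(n):
            if durations[j] > durations[i]:  # j still at risk at t_i
                comparable += 1
                if cif[ti, i] > cif[ti, j]:
                    concordant += 1.0       # higher risk failed earlier
                elif cif[ti, i] == cif[ti, j]:
                    concordant += 0.5       # ties count half
    return concordant / comparable

# Toy example: subject 0 fails from cause 1 first and has the highest CIF.
time_grid = np.array([1.0, 2.0, 3.0])
cif_cause1 = np.array([[0.90, 0.10, 0.20],
                       [0.95, 0.20, 0.30],
                       [0.99, 0.30, 0.40]])
durations = np.array([1.0, 2.0, 3.0])
events = np.array([1, 2, 1])
print(cause_specific_concordance(cif_cause1, time_grid, durations, events, cause=1))
# → 1.0 (every comparable pair is correctly ordered)
```

In practice you would evaluate each cause's CIF separately (rather than the all-cause survival function) to reproduce the paper's per-cause numbers; even then, remaining gaps can come from hyperparameters, training length, and the synthetic-data split.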