1
Model Enhanced Classification of Serious Adverse Events
SCOPE 2019
2
Jingshu Liu Lead Data Scientist, Medidata Solutions
3
Background
4
Example event descriptions: "Very high blood pressure", "Bruising on hip due to fall", "Headache for 2 days", "Stroke", "Rash on leg", "Tingling in fingers+toes"
Adverse Events (AE) vs. Serious Adverse Events (SAE)
5
Challenges with SAE Classification
Different standards, within and among organizations
6
SAE Definition
An event is considered serious when the patient outcome is: death; life-threatening; hospitalization (initial or prolonged); disability or permanent damage; congenital anomaly or birth defect; required intervention to prevent permanent impairment; or other important medical events (that may lead to the above).
Medical and scientific judgement should be exercised in deciding whether an event is 'serious' in accordance with these criteria.
The FDA supplements this definition with categories such as "events that may lead to any of the above outcomes, or require intervention to prevent those outcomes, upon appropriate medical judgment" and "unexpected events for the target condition or population".
A few years ago, an FDA briefing document on a diabetes drug pointed out that the investigator failed to report 3 stroke events as serious when the patient was not hospitalized.
7
Challenges with SAE Classification
Different standards (within and among organizations)
Quantitative evidence is frequently lacking
Review is time-consuming
Systemic and human error (compounds the problem)
Among the over 100 stroke events recorded in those trials, 94% were reported to be serious.
8
Detection of SAEs is a complex process
We have heard from our sponsors that in smaller phase I and II trials, they often review every single adverse event, while in larger phase III trials, where that is infeasible, they review a sample of all events. Yet serious adverse events occur only a small percentage of the time, so most of the events that get reviewed end up being classified as non-serious.
9
Data for Medical Review
Clinical database
External data (e.g., lab data, ECG, imaging data, ePRO, diaries, etc.)
Other reports (e.g., admission/discharge reports, autopsy reports, etc.)
Important Medical Event (IME) list, maintained by the EudraVigilance group
A report from the Clinical Trials Transformation Initiative estimated a median review time of 15 minutes per SAE.
10
Challenges with SAE Classification
Different standards (within and among organizations)
Quantitative evidence is frequently lacking
Review is time-consuming
Many factors to consider
Systemic and human error (compounds the problem)
11
Generate insights from standardized industry AE data
12
Medidata Enterprise Data Store: >15,000 trials, >4,400,000 trial subjects
13
Adverse Events Data: 1M AE records, 1.8K trials, 130 sponsors
6% serious; completed trials; Give-to-Get data rights; all records are de-identified.
* Data is split 7:2:1 by patient into training, validation, and test sets.
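A minimal sketch of such a patient-level 7:2:1 split, assuming a patient_id column (the deck does not describe how the split was implemented):

```python
import numpy as np
import pandas as pd

def split_by_patient(ae_df: pd.DataFrame, seed: int = 42):
    """Split AE records 7:2:1 into train/validation/test by patient,
    so all events from one patient land in the same partition."""
    rng = np.random.default_rng(seed)
    patients = ae_df["patient_id"].unique()
    rng.shuffle(patients)
    n = len(patients)
    train_ids = set(patients[: int(0.7 * n)])
    val_ids = set(patients[int(0.7 * n): int(0.9 * n)])
    part = ae_df["patient_id"].map(
        lambda p: "train" if p in train_ids else ("val" if p in val_ids else "test")
    )
    return ae_df[part == "train"], ae_df[part == "val"], ae_df[part == "test"]
```

Splitting by patient rather than by record keeps events from the same patient out of different partitions, avoiding leakage of patient-level information into the test set.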
14
Seriousness rate increases with severity grade
Chart: % serious by severity grade, with the number of events in each grade (e.g., 672K).
15
AEs with high seriousness rate
Chart: AE terms with the highest seriousness rates (up to 98%) and their frequencies. * Events with more than 100 occurrences in dataset.
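Both of the preceding summaries (seriousness rate by severity grade, and per-term seriousness rate for frequent terms) are simple grouped rates; a minimal pandas sketch with assumed column names severity_grade, meddra_pt, and is_serious:

```python
import pandas as pd

def seriousness_rate_by_grade(ae_df: pd.DataFrame) -> pd.Series:
    """Fraction of events flagged serious within each severity grade."""
    return ae_df.groupby("severity_grade")["is_serious"].mean().sort_index()

def high_seriousness_terms(ae_df: pd.DataFrame, min_count: int = 100) -> pd.DataFrame:
    """Seriousness rate per MedDRA preferred term, restricted to terms
    with more than `min_count` occurrences in the dataset."""
    stats = ae_df.groupby("meddra_pt")["is_serious"].agg(rate="mean", count="size")
    return stats[stats["count"] > min_count].sort_values("rate", ascending=False)
```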
16
Classify SAEs with multivariate probabilistic models
17
Features
AE: Severity, MedDRA terms, Important Medical Event (IME), Event Duration, Outcome, Hospitalization*, ...
Demographics: Age, Sex, Race, Med Hx*, Labs*, ...
Patient Event History: First event? Multiple events? Serious/severity of previous events, ...
Concurrent Events: Time between events, ...
Study-level features: Sponsor, Phase, Indication, ...
* These features have not yet been incorporated into our models but will be explored in future work.
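As an illustration only (not the presenter's implementation), several of the patient event-history features above can be derived directly from the AE table; the column names patient_id, onset_date, and is_serious are assumptions:

```python
import pandas as pd

def add_history_features(ae_df: pd.DataFrame) -> pd.DataFrame:
    """Derive simple patient event-history features for each AE record.
    Assumes hypothetical columns: patient_id, onset_date (datetime), is_serious (0/1)."""
    df = ae_df.sort_values(["patient_id", "onset_date"]).copy()
    grp = df.groupby("patient_id")
    df["prior_event_count"] = grp.cumcount()            # 0 means "first event"
    df["is_first_event"] = df["prior_event_count"] == 0
    # Mean seriousness of the patient's earlier events (NaN if none)
    df["prior_serious_rate"] = grp["is_serious"].transform(
        lambda s: s.astype(float).shift().expanding().mean()
    )
    # Time between this event and the patient's previous event, in days
    df["days_since_last_event"] = grp["onset_date"].diff().dt.days
    return df
```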
19
How Do We Evaluate Model Performance?
Events are ranked from less likely to more likely to be serious; choosing a decision threshold splits them into true positives, false positives, false negatives, and true negatives (SAE vs. Non-SAE).
Precision = TP / (TP + FP): how many selected events are SAEs?
Recall = TP / (TP + FN): how many SAEs are selected?
Example trade-off: a threshold giving Precision 0.75 / Recall 0.5 vs. one giving Precision 0.5 / Recall 0.75.
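A minimal sketch of these two metrics in Python (a hypothetical helper; labels and predictions are encoded as 1 = SAE, 0 = non-SAE):

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for binary SAE predictions (1 = serious)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```

In practice, sklearn.metrics.precision_score and recall_score compute the same quantities.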
20
Benchmark Model: IME + Severity
Rule: classified as SAE if the AE is on the IME list or Severity is Grade 3 (Severe) or higher.
Precision: 0.29, Recall: 0.79 (PR curve shown against a random model baseline).
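A sketch of the benchmark rule as code; the deck only states the rule, so the function and argument names, the set of IME MedDRA terms, and the numeric severity grading are assumptions:

```python
def benchmark_is_serious(meddra_pt: str, severity_grade: int, ime_terms: set) -> bool:
    """Benchmark rule: flag as SAE if the event's MedDRA term is on the
    IME list or its severity is Grade 3 (Severe) or higher."""
    return meddra_pt in ime_terms or severity_grade >= 3
```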
21
Model 1: Logistic regression
Features: Severity, MedDRA PT, indication, average seriousness of previous occurrences, age.
Precision: 0.43, Recall: 0.90 (PR curve shown against a random model baseline).
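A sketch of how such a logistic regression could be set up with scikit-learn; the column names, preprocessing, and class-weighting choices are assumptions, not the presenter's implementation:

```python
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical feature columns mirroring the slide's list
categorical = ["meddra_pt", "indication"]
numeric = ["severity_grade", "avg_prior_seriousness", "age"]

model = Pipeline([
    ("prep", ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
        ("num", StandardScaler(), numeric),
    ])),
    ("clf", LogisticRegression(max_iter=1000, class_weight="balanced")),
])

# model.fit(train_df[categorical + numeric], train_df["is_serious"])
# scores = model.predict_proba(val_df[categorical + numeric])[:, 1]
```

class_weight="balanced" is one way to handle the roughly 6% seriousness rate; how the presenter handled class imbalance is not stated.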
22
Model 2: Neural Network Learns interaction between features of each event
23
Model 2: Neural Network Learns interaction between features of each event Learns complex interactions between concurrent events
24
Model 2: Neural Network Learns interaction between features of each event Learns complex interactions between concurrent events Learns how to use patient history without manual feature engineering
25
Model 2: Neural Network
Features: all event-, patient-, and study-level features (e.g., Severity, MedDRA PT, whether the event is on the IME list, etc.)
Precision: 0.60, Recall: 0.95 (Neural Net PR curve shown against a random model baseline).
* Area under the PR curve (PR-AUC): Logistic regression , Neural Net
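The deck does not disclose the network architecture. As one illustrative sketch only (PyTorch; all names are assumptions), a MedDRA term embedding can be concatenated with numeric event-, patient-, and study-level features to score an event; pooling over a patient's concurrent and historical events could be added on top of this:

```python
import torch
import torch.nn as nn

class SAEClassifier(nn.Module):
    """Minimal sketch: embed the MedDRA PT, concatenate with numeric
    features, and score the event as serious or not."""
    def __init__(self, n_terms: int, n_numeric: int, emb_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.term_emb = nn.Embedding(n_terms, emb_dim)
        self.mlp = nn.Sequential(
            nn.Linear(emb_dim + n_numeric, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, term_idx: torch.Tensor, numeric: torch.Tensor) -> torch.Tensor:
        x = torch.cat([self.term_emb(term_idx), numeric], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)  # P(serious)
```

An embedding lets the model learn similarities between MedDRA terms rather than treating them as unrelated one-hot categories.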
26
Medical review: A New Hope
27
Challenges Addressed
Review is time-consuming
Human error + different standards
Lacking quantitative evidence
29
Challenges Addressed
(Review is time-consuming; human error + different standards; lacking quantitative evidence)
Amount of review needed to capture 99% of SAEs: IME + Severity: 96% of events; Neural Net: 41% of events.
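The workload comparison can be reproduced from model scores by ranking events from highest to lowest predicted risk and counting how many must be reviewed before 99% of true SAEs are captured; a sketch under those assumptions:

```python
import numpy as np

def review_fraction_for_recall(y_true, scores, target_recall=0.99):
    """Fraction of all events that must be reviewed (highest risk first)
    to capture `target_recall` of the true SAEs."""
    y = np.asarray(y_true)[np.argsort(-np.asarray(scores))]
    cum_recall = np.cumsum(y) / max(y.sum(), 1)
    n_needed = min(int(np.searchsorted(cum_recall, target_recall)) + 1, len(y))
    return n_needed / len(y)
```

On the slide's figures, the IME + Severity rule needs 96% of events reviewed to reach that coverage, versus 41% for the neural network.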
30
Value of Model-Enhanced SAE Classification
Improves Accuracy: Our model outperforms the industry baseline of IME + Severity Grade by a large margin.
Reduces Time: Our model can evaluate an event and the corresponding patient profile to make a prediction in seconds.
Reduces Workload: Our model prioritizes records for review with higher precision.
31
Thank you. For more information, please contact: