Storing evaluation models, criteria, and metrics
CTSA Ontology Workshop, Orlando, FL
Dagobert Soergel
Department of Library and Information Studies, Graduate School of Education, University at Buffalo
Prologue. Integrated Information Infrastructure
An information system to support many CTSA functions:
• support for discovering specific expertise, collaboration opportunities, data sets, specimens, apparatus, software, …;
• support for research: managing clinical studies, data analysis;
• a program income system for resources and services, and grant budget management;
• managing the many interdependent activities and deadlines (project management);
• tracking, assessment, and evaluation.
Prologue. Integrated Information Infrastructure
Needs many types of data; just some examples:
• many kinds of people, their expertise, projects, publications, and the groups to which they belong;
• resources, such as animal models, reagents, cell lines, data sets, core facilities;
• types of research supported, instances of research supported, users, publications;
• training opportunities, their objectives, content, trainee participants, mentors, evaluation.
Prologue. Integrated Information Infrastructure
Communicating Oracle database and triple store.
Open: tables, triples, and queries can be added freely.
Key: everyone must use defined ontologies – entity types, relationship types, entity values.
[Slide diagram: Oracle tables queried with Oracle queries, alongside a triple store queried with SPARQL]
Storing a logic model
A logic model is a set of assertions of the form
Factor/variable/construct A <hasEffectOn> (Factor/variable/construct B, EffectDirectionAndSize)
Example:
UsingStandardOntologies <hasEffectOn> (DataIntegrability, +5)
DataIntegrability <hasEffectOn> (DrugDiscovery, +3)
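A minimal sketch of how such assertions could sit in the triple store: because each assertion carries both a target factor and an effect direction/size, it can be reified as a small node. The ex: namespace and the onFactor/effectSize property names are illustrative assumptions, not part of any ontology defined in these slides.

```sparql
# Illustrative only: ex:, ex:onFactor, and ex:effectSize are assumed names.
# Each <hasEffectOn> assertion becomes a blank node holding the target factor
# and the effect direction and size.
PREFIX ex: <http://example.org/ctsa#>

INSERT DATA {
  ex:UsingStandardOntologies ex:hasEffectOn [
      ex:onFactor   ex:DataIntegrability ;
      ex:effectSize "+5"
  ] .
  ex:DataIntegrability ex:hasEffectOn [
      ex:onFactor   ex:DrugDiscovery ;
      ex:effectSize "+3"
  ] .
}
```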
Storing milestones
Data to be stored about milestones. Examples:
Milestone <occursOnDate> Date (Date can be recurring)
Milestone <hasCriterion> (CM, Value)   (CM = Construct or Metric)
Milestone <hasResponsiblePerson> Person
Milestone <hasNotificationSpan> Duration
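A minimal sketch, under assumed ex: names, of what one milestone record could look like as triples; the <hasCriterion> (CM, Value) pair is reified as a node.

```sparql
# Illustrative only: the milestone, person, and criterion identifiers are invented.
PREFIX ex:  <http://example.org/ctsa#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

INSERT DATA {
  ex:AnnualProgressReport
      ex:occursOnDate         "2013-05-31"^^xsd:date ;
      ex:hasResponsiblePerson ex:EvaluationDirector ;
      ex:hasNotificationSpan  "P14D"^^xsd:duration ;
      # <hasCriterion> takes a (CM, Value) pair, so it is stored as a small node.
      ex:hasCriterion [
          ex:onCM    ex:PctStudiesMeetingAccrualGoals ;
          ex:atValue 80
      ] .
}
```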
Storing data on criteria and metrics
In evaluation it may make sense to distinguish, as is commonly done in the social sciences, between evaluation criteria or constructs (for example, research productivity) and metrics that operationalize these constructs (for example, % proposals funded). On the other hand, the boundary may not be sharp. I will use CM to designate any entity that might be considered a construct or a metric.
Storing data on criteria and metrics
A CM is a property of some entity. The value of a CM can be derived from data through a specified procedure and/or computed from other CMs. Values can be given on a nominal, ordinal, interval, or ratio scale, or they may simply be a list of accomplishments, such as a list of three drugs discovered in a program over the last three years, with some report on the use or potential use of these drugs.
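As one hedged example of a value derived from stored data: the CM % proposals funded (one of the metrics listed later in this deck) could be computed per program with an aggregate query. The ex:submittedUnder / ex:fundingStatus modeling is an assumption made only for this sketch.

```sparql
# Illustrative only: the proposal data model is assumed, not taken from the slides.
# Computes the percentage of proposals funded, grouped by program.
PREFIX ex: <http://example.org/ctsa#>

SELECT ?program
       ((100 * SUM(IF(?status = ex:Funded, 1, 0)) / COUNT(?proposal))
        AS ?pctProposalsFunded)
WHERE {
  ?proposal ex:submittedUnder ?program ;
            ex:fundingStatus  ?status .
}
GROUP BY ?program
```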
Storing data on criteria and metrics
The next slide shows some examples of the data to be stored. One application is an inventory of all evaluation criteria/measures used in the CTSA consortium.
Type of data, relationship type
CM <indicatedBy> CM
CM <operationalizedBy> CM
CM <computedWith> Formula
CM <usedInComputing> (CM, Direction, Power)
CM <isUsefulFor> (DecisionType, Agent)
CM <usedBy> (LegalEntity, Decision)
CM <determinedWithFrequency> Interval
CM <includedIn> InformationArtifact
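A hedged sketch of a few of these relationship types instantiated for the CM % proposals funded; every ex: identifier (formula, report, agency, decision) is invented for illustration, and the n-ary <usedBy> (LegalEntity, Decision) pair is reified as a node.

```sparql
# Illustrative only: all ex: names are assumptions.
PREFIX ex:  <http://example.org/ctsa#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

INSERT DATA {
  ex:ResearchProductivity ex:operationalizedBy ex:PctProposalsFunded .
  ex:PctProposalsFunded
      ex:computedWith            ex:FundedOverSubmittedFormula ;
      ex:determinedWithFrequency "P1Y"^^xsd:duration ;
      ex:includedIn              ex:AnnualProgressReport ;
      ex:usedBy [
          ex:byLegalEntity ex:NIH_NCATS ;
          ex:forDecision   ex:GrantRenewalDecision
      ] .
}
```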
Data needed to compute CMs 1
Research integrated with patient care. Community engaged in research
• # studies in which clinical and research institutions collaborate
• Research priorities stated through community engagement
Bridging basic ► pre-clinical ► clinical
• Case studies of research collaboration across the entire pipeline: genomics & imaging to patient care
Research collaboration across disciplines and institutions. Team science
• Social network analysis: co-researchers, co-authors, members on review and study support teams
• Multicenter trials
• Institutional & corporate collaborations
Data needed to compute CMs 2
Research productivity and quality
• % proposals funded
• # pilot studies leading to funded studies (see the query sketch below)
• % studies meeting accrual goals
• Usage of services, by type
• # and % of researchers using services
• # publications and their impact
Timeliness of research. Time from:
• IRB submission ► approval
• grant award ► study opening
• study completion ► publication
• study completion ► application
• # studies moving from one phase to the next
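A hedged sketch of how one of these counts, # pilot studies leading to funded studies, could be pulled from the triple store; the ex:ledTo link and the other ex: names are assumptions about how such pipeline data might be recorded, not definitions from these slides.

```sparql
# Illustrative only: one plausible way of recording that a pilot study led to a
# later, funded full study.
PREFIX ex: <http://example.org/ctsa#>

SELECT ?program (COUNT(DISTINCT ?pilot) AS ?pilotsLeadingToFundedStudies)
WHERE {
  ?pilot  a ex:PilotStudy ;
          ex:partOfProgram ?program ;
          ex:ledTo         ?study .
  ?study  ex:fundingStatus ex:Funded .
}
GROUP BY ?program
```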
Data needed to compute CMs 3
Improved data sharing, reusability, and security
• % studies with excellent data security
• % studies using standard ontologies
Trainees educated for research careers
• # participants in educational events
• Career trajectory (i.e., career choices and success of trainees)
• Publications and funding of trainees
Data needed to compute CMs 4
Research findings relevant to prevention and treatment
• Inventory of research findings and their translation potential
Changes to practice guidelines
• Inventory of changes based on findings of BTC research and their significance
New or modified diagnostics, therapeutics, and preventive measures, and their actual use:
- new drugs / # NDAs / # INDs
- improved/new uses of drugs
- new devices
- patents
- new procedures
• Inventory with indication of significance
Improved health in the Buffalo region
• Public health measures, especially as applied to underrepresented groups
The end
Storing data on criteria and metrics
Definitions (sort of)
In the context of survey research, a construct is the abstract idea, underlying theme, or subject matter that one wishes to measure using survey questions. Some constructs are relatively simple (like political party affiliation) and can be measured using only one or a few questions, while other constructs are more complex (such as employee satisfaction) and may require a whole battery of questions to fully operationalize the construct to suit the end user's needs.