Collaborating to overcome the obstacles to OA
Open Education and Open Access Panel, CC Summit, April 14th 2018, Toronto
This work is licensed under a Creative Commons Attribution 4.0 International License.
Tom Cochrane, Leslie Chan, Kathleen Shearer, Nic Suzor, Timothy Vollmer
A shared chronology
Creative Commons founded in 2001; in Australia from 2004
Budapest OA declaration, 2002
OA starts in Australia with QUT in 2003/04
Early confusion about the relationship between the two
Some continuing debates
The big picture, research and scholarship
Revolutionary change in:
discovery techniques
communication and sharing techniques
evidence in observations
potential reach and impact
At its core, research as a human activity
1. Generation and verification of new knowledge
2. Certification of its quality
3. Its dissemination
4. Subsequent recognition
But…
Control of steps 3 and 4 was long ago ceded to a powerful oligopoly.
AND…
Processes in the academy for recognising excellence are tied to metrics controlled by the same interests.
The linchpin of this control is…
Control and lockdown through copyright assignment, leading to a closed system…
Problems of this closed system
integrity and fraud
inefficiency (slowness)
unnecessary limitations on recognition and impact
cost
Dimensions of the problem
global expenditure estimates – collective outlays by research institutions
the economics of moral hazard
varying estimates of the margin between cost and price of academic and research publishing (illustrated below)
the current (disappointing) tide level for OA
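To make the cost and price point concrete, the margin in question can be written as the gap between what is charged per article and what it costs to produce. The figures in the worked example are purely hypothetical placeholders, chosen only to show the arithmetic, since the slide itself stresses that estimates vary:

\text{margin} = \frac{\text{price} - \text{cost}}{\text{price}}, \qquad \text{e.g.}\quad \frac{\$3000 - \$1000}{\$3000} \approx 67\%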
Conversely…
Increasing traditional scholarly metrics (citation)
Optimising dissemination speed
Simplifying (“de-magnifying”) dissemination costs
Improving integrity and honesty (reproducibility)
Broadening dissemination beyond the academy
Involving new ‘lay’ communities (citizen science)
Enhancing fresh discovery on existing material (re-use)
The Obstacle
1. Surrendered control of research outputs has led to an extraordinarily profitable business.
2. That business has made its control of metrics an exclusive essential in the assessment of research quality (the Journal Impact Factor, or JIF, defined below).
3. At the institutional level there is insufficient leadership to coordinate a rational response to this distortion.
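For readers unfamiliar with the metric named in point 2, the Journal Impact Factor is a simple two-year citation ratio; the formula below is the standard published definition rather than anything specific to this talk:

\mathrm{JIF}_{Y} = \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}

Because both the citation counts and the judgement of what counts as a “citable item” sit with the index provider, the metric lives entirely inside the closed system described above.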
Ongoing confusion
Lack of coordination, reinforced by the competitive state of the research system, together with systematic and systemic confusion “on the ground” about copyright, its assignment, and licensing alternatives.
Panel Discussion
Leslie Chan, Kathleen Shearer, Nic Suzor, Timothy Vollmer
Summary
The research system as a whole is in trouble. It is increasingly driven by competitive but inauthentic metrics which perpetuate a status quo of institutional rankings, themselves under continual methodological challenge. Worse, this process also supports the highly profitable rent-seeking of those who provide the basis of these inauthentic metrics. The ultimate evidence of the degree of this trouble is the practice, in some university systems, of awarding academics huge cash prizes for landing an acceptance in a high-impact-factor journal. It is quite possible that rich publishers will support, or are perhaps already supporting, such incentives. The whole arrangement continues to be underpinned by the signing over of IP.
A radical alternative
A modern, properly net-based system to verify and certify research outcomes on a global scale. Such a system would be one part of a more open approach to research, already described by the concept of open science: the interoperable accessibility (and therefore verifiability) of processes, sophisticated linking of data, text and software, and high-speed sharing and communication of important results. Such an ecosystem can underpin both public-spending efficiencies and rapid diffusion, impact and private-sector innovation.
Remedies – action and leadership to:
develop and support open frameworks
harmonise supporting IP regimes
reframe researcher induction
improve data and tools support services
reward data science methods and re-use techniques
rationalise research quality markers
foster impact tracking in diverse tools (see the sketch below)
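As a small illustration of the last point, the sketch below queries the public Crossref REST API for a DOI's citation count and licence information, the kind of openly available signal that impact tracking outside proprietary metrics could build on. It is a minimal sketch in Python: the /works endpoint and the is-referenced-by-count and license fields reflect Crossref's documented API, but treat the exact field names, and the placeholder DOI, as assumptions to check against current documentation.

# Minimal sketch: pull openness and impact signals for a DOI from Crossref's
# public REST API (field names assumed from the documented /works route).
import json
import urllib.request

def lookup_doi(doi: str) -> dict:
    """Return the title, citation count and licence URLs Crossref holds for a DOI."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        record = json.load(resp)["message"]
    titles = record.get("title") or ["(untitled)"]
    return {
        "title": titles[0],
        "citations": record.get("is-referenced-by-count", 0),              # citations Crossref knows about
        "licences": [lic.get("URL") for lic in record.get("license", [])], # e.g. a CC BY licence URL
    }

if __name__ == "__main__":
    # "10.1234/example-doi" is a hypothetical placeholder; substitute any registered DOI.
    print(lookup_doi("10.1234/example-doi"))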
Discussion?