A2 Agreement Trial ICT November 2013
Outline Agenda
10.00  Welcome and introductions
10.05  Travel Expenses
10.10  GCE ICT – Entries and Transitional Arrangements
10.15  2013 Outcomes and Issues
10.45  Coffee
11.15  Principal Moderator's Report / Exemplar Materials
12.30  Lunch
1.15   Exemplar Materials
3.00   Plenary
Objectives
To inform Centres of the new A2 Assessment Arrangements
To provide feedback from the A2 Award 2013
To agree marking standards and assessment criteria for the exemplar materials
Travel Expenses
Attendees at CCEA agreement trials are entitled to be reimbursed for their travel expenses.
The process has changed slightly because of the introduction of auto-enrolment (workplace pensions for all).
CCEA is required to provide real-time information (RTI) to HMRC about all payments made, including travel expenses.
To allow for this, all attendees are required to complete a Personal Data Form.
Payments due to you cannot be processed until you have completed the Personal Data Form.
The Personal Data Form can be completed by visiting http://www.ccea.org.uk/autoenrolment
GCE ICT Entry Details

Unit  Previous Code  New Code  Revised Weighting          Revised UMS  Implementation Date
A21   AW211          AP211     60% of A2, 30% of A-level  120 UMS      First examination in January 2014
A22   AW221          AP221     40% of A2, 20% of A-level  80 UMS       First submission in May 2014

New cash-in code: A2654, for the first A2 cash-in in summer 2014.
GCE ICT Entry Details
For resit candidates:
Previous weightings still hold for candidates who began the course prior to September 2012, i.e. candidates in Year 14 who are resitting A2 units.
The final resit opportunity for these candidates, under the old weightings, will be in January 2014. This includes resitting the Coursework Component.
Existing component codes must be used: AW211, AW221.
The Current A2 Award
Coursework grade boundaries are high
Examination performance has improved at the upper end of the mark range
The New A2 Award
More balanced outcomes predicted!
Changes to A Level ICT
AS unit weightings, from September 2012: 40% Coursework / 60% Examination
A2 unit weightings, from September 2013: 40% Coursework / 60% Examination
Coursework Weightings

Section                     Mark
Analysis                    20
Design                      20
Implementation and Testing  20
Documentation               10
Evaluation                  10
Total                       80

This will not change in September 2013.
Implications of Change
Content of A21 will not increase to reflect the change of weighting (from 50% to 60%)
Content of A22 will not decrease to reflect the change of weighting (from 50% to 40%)
The change of weightings should contribute to a more balanced award outcome in terms of grade boundaries
Encourages candidates to perform better in the examination module
What is Moderation?
CCEA requests a sample from the Centre
The moderation team meets to agree standards
Each Moderator is allocated a number of Centres
The Moderator sub-samples each centre's sample: (N/2) + 1 candidates, including the top and bottom
If the sub-sample is within tolerance, the process finishes; tolerance is a tool available only to Moderators
If the sub-sample is outside tolerance, the whole sample is moderated
The Moderator does not change Centre marks, only recommends a change
Moderators are supervised during the process to ensure consistent application of standards
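The sub-sampling rule above can be sketched in code. This is an illustrative sketch only, not CCEA's actual procedure: the function name, the candidate records, and the choice of which middle candidates fill out the sub-sample are all invented for the example. The fixed parts are the (N/2) + 1 size and the inclusion of the top and bottom candidates.

```python
def sub_sample(sample):
    """Pick (N/2) + 1 candidates from a centre's sample of N,
    always keeping the highest- and lowest-marked candidates."""
    n = len(sample)
    k = n // 2 + 1
    ranked = sorted(sample, key=lambda c: c["mark"], reverse=True)
    if k >= n:
        return ranked  # small centres: the whole sample is taken
    # Keep top and bottom, then fill the remaining k - 2 places
    # from the middle of the rank order (an arbitrary illustrative choice).
    middle = ranked[1:-1][: k - 2]
    return [ranked[0]] + middle + [ranked[-1]]

# A centre sample of 6 candidates gives a sub-sample of (6/2) + 1 = 4.
centre = [{"name": f"C{i}", "mark": m} for i, m in enumerate([72, 65, 58, 51, 44, 30])]
picked = sub_sample(centre)
```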
Moderation 2013
The majority of solutions involved a relational database
The majority of centres applied the assessment criteria successfully
Evidence of candidates building on their AS work to develop an A2 project
Moderation 2013
Teacher comments/annotation are an important aspect of moderation
"Cutting and pasting" assessment criteria indicators to justify marks awarded is not helpful to the Moderation Team
The overall range and quality of solutions continues to be excellent
Moderation 2013
Encourage candidates to submit solutions in clearly labelled sections as outlined in the specification
Candidates need to fully detail the design part of the solution, such as explaining each stage of Database Normalisation with supporting ER Models
Centres should avoid a common "template" approach and should provide more individual guidance
Templates
Limited scope of problem identification
Lack of a third-party user can result in vague user requirements
Design can be limited due to inadequate Analysis
Testing can be limited due to vague user requirements
Centre Report
Centres receive a TAC 6 report and can request follow-up
The moderation team will attempt to provide feedback for Centres ahead of the next series
This may include positive comments as well as areas for development
Moderator comments need careful consideration, for example:
  "…slightly generous in…"
  "…lenient in the…"
  "…slightly severe across all assessment criteria…"
Linking AP211/AP221
Data modelling, including ER models and Database Normalisation
System development life cycles
Approaches to software development
Testing and software maintenance
User interfaces
User support and ICT training
Implications of ICT within an organisation
CCEA Support 2013/14
Agreement Trials
Centre Visits (if appropriate)
Coursework Clinic (February 2014)
Continued development of the CCEA Microsite
Exemplar Materials 2013
1. AC Cars
2. Nicola's Cakes
3. Army Cadet Force
Additional Support for new Centres
Analysis
Define the nature of the problem to be solved
Fact-finding methods to investigate the problem
Identify data sources
Gather sample documents currently used
Identify the current user activities
Investigate the tasks carried out by the user
Specify limitations of the current system
Describe the information requirements of a system
State the objectives of the new system
Analysis: Mark Ranges
16-20: A detailed, coherent analysis has been produced for a demanding problem. Full discussion of information requirements, including fact finding. Demonstrates an in-depth understanding of structured analysis techniques in the investigation and effective use of structured analysis tools in specifying the system.
11-15: A good analysis has been produced for a demanding problem, or a detailed analysis has been produced for a less demanding problem. A discussion of information requirements, including fact finding. Demonstrates good understanding of structured analysis techniques in the investigation and reasonable use of structured analysis tools in specifying the system.
6-10: A good analysis of a simple problem, or a limited analysis of a difficult problem. Requirements specification included but with little or no justification. Limited use made of structured methods in investigating and specifying.
0-5: A simple problem with little evidence of any analysis; aims not clearly identified; poor investigation and recording of findings. Little use made of structured methods in investigating and specifying.
Design
Evaluate possible solutions
Design and document data capture forms
Design of the user interface
Describe the data validation required
Design and document data structures
Choose appropriate hardware and software
Relate the solution to the capabilities of the software and hardware
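"Describe the data validation required" from the design stage above translates directly into code at implementation. The following is a minimal sketch of how one documented validation rule might be realised; the field (quantity ordered) and the accepted range (1 to 100) are invented for the example, not taken from any exemplar.

```python
def validate_quantity(value):
    """Range and type check: quantity ordered must be a whole number
    from 1 to 100 inclusive (illustrative rule only)."""
    # Reject booleans explicitly, since bool is a subclass of int in Python.
    if isinstance(value, bool) or not isinstance(value, int):
        return False
    return 1 <= value <= 100
```

A design document would state this rule in prose ("presence check, type check, range check 1–100"); the code is simply that statement made executable.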
Implementation and Testing
Software solution developed from the design (not documented)
No explicit Software Development section; evidence of testing will imply software has been developed
A test plan produced from the system objectives, including:
  Valid, invalid and extreme data
  Testing of the user interface
  System functionality
  Evidence of user testing
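The valid/invalid/extreme categories above can be made concrete with a small, self-contained sketch. The validation rule here (age must be a whole number from 16 to 19) is an invented example, not drawn from the specification; the point is the shape of the test plan: normal data, boundary values, and data that should be rejected.

```python
def is_valid_age(age):
    """Illustrative rule: candidate age must be an integer from 16 to 19."""
    return isinstance(age, int) and not isinstance(age, bool) and 16 <= age <= 19

# Each row: category of test data, input, expected outcome.
test_plan = [
    ("valid",   17,  True),   # normal data within the accepted range
    ("extreme", 16,  True),   # boundary value: lowest accepted
    ("extreme", 19,  True),   # boundary value: highest accepted
    ("invalid", 15,  False),  # just below the boundary
    ("invalid", 20,  False),  # just above the boundary
    ("invalid", "x", False),  # wrong data type
]

for category, data, expected in test_plan:
    actual = is_valid_age(data)
    assert actual == expected, (category, data, actual)
```

Cross-referencing then means recording, for each row, the actual output next to the expected one, exactly as the mark scheme asks.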
Implementation and Testing
Output from the testing, cross-referencing the test plan
A description of a strategy for implementing the system in the organisation:
  Implementation plan
  Description of the system changeover
  The training required
  The problems encountered and the actions taken
  Description of how existing data is converted for use in the new system
Implementation and Testing: Mark Ranges
16-20: Evidence of a full and effective software solution to a demanding problem. Clear evidence of a full and effective test plan for a demanding problem. The results of the testing are fully documented, with outputs cross-referenced to the original plan. Evidence that all functions agreed with the user(s) are present and correct. Corrective action taken as a result of testing is clearly documented. Plan for implementing the solution.
11-15: Evidence of a reasonable software solution to a demanding problem, or an effective software solution to a less demanding problem. Evidence of a reasonable test plan for a demanding problem. Test plan followed in a systematic way, but with omissions and/or not all cases tested (i.e. no evidence of testing). Some documentation of corrective action taken as a result of testing. Plan for implementing the solution.
6-10: Evidence of an effective software solution to a simple problem, or a limited software solution to a demanding problem. Evidence of an effective test plan and cross-referenced outputs for a simple problem, or patchy/limited testing of a demanding problem. Brief plan for implementing the solution.
0-5: Inadequate software solution to a simple problem. Inadequate test strategy and test plan devised, or a plan followed in a limited fashion. Little or no hard-copy evidence of the results of testing or implementation of the solution.
User Documentation
Installation guide
Step-by-step operating instructions
Troubleshooting
Backup procedures
Evaluation
Evaluate results against objectives
Identify strengths and limitations of the final system
Identify possible extensions to the system