
1 GCE Software Systems Development
AS Agreement Trial October 2017

2 Thanks
This year there was a marked improvement among the centres delivering the AS syllabus, continuing a year-on-year improvement, and a significant improvement in the way work was presented, which helps the moderation process. The staff and students are to be commended for their hard work and for submitting the work before the required deadline.

3 Agenda
Welcome and Introduction
Summer 2017
Malpractice Awareness
Administration Considerations
General Moderation Findings
Specification Overview
The AS2 Assessment Criteria
Exemplar 1 – Table Marking
Lunch
Exemplar 2 – Table Marking
Exemplar 3 – Table Marking
Good Practice Snippets
AOB

4 Summer 2017
Fifth cohort of the AS award. Work was submitted from 48 centres:
436 candidates for AS 1 (85 resits)
372 candidates for AS 2 (21 resits)

5 Malpractice Awareness
JCQ guidance:
Instructions for Conducting Controlled Assessments
Instructions for Conducting Coursework
Suspected Malpractice in Examinations and Assessments
Plagiarism in Examinations

Key points:
Authentication of candidates' work; plagiarism, collusion or copying.
Pre-entry or pre-authentication: a school issue. Post-entry or post-authentication: report to CCEA.
Completion of internal assessments under the required conditions.
Improper assistance: any act of assistance given beyond what is permitted in the specification or regulations.

It is important to read the JCQ guidance; it also contains useful information for candidate guidance and school policies.

Teachers are required to sign an authentication statement confirming that each candidate's work is their own individual effort, completed under the required subject conditions. It is important that this is completed or marks cannot be accepted. Your signature confirms that all the work is the candidate's own, that no additional help was given (if it was, this should be noted and marked accordingly), and that the work reflects the ability observed during the course.

Be aware of possible candidate malpractice: content taken from the internet or other sources that has not been appropriately referenced (a copy of the JCQ guidance on this is provided for you); students working collaboratively beyond what is permitted, which could affect the ability to award a fair individual mark; and students copying. Note that a student who allows their work to be copied is also committing malpractice and will be subject to a penalty that removes marks from either the section or the whole unit.

If a candidate has been entered for a unit and has signed the authentication documentation when malpractice is found, it should be reported to CCEA; otherwise use the centre's internal procedures.

Malpractice is found annually in candidate work submitted for moderation. The school will be asked to investigate; it may be candidate or teacher malpractice (e.g. for copying or plagiarism, would the controls have allowed this to happen, or does it indicate that the required conditions were not followed?). It is important that the specification guidelines on completing internal assessment are followed; where they are not, this is teacher malpractice.

Teacher malpractice is most likely to come under the category of improper assistance: any act where assistance is given beyond what is allowed to a candidate or group of candidates, resulting in a potential or actual advantage in assessment. The rules may vary depending on the subject; the JCQ list includes giving advice on specific improvements to meet the criteria, giving detailed advice or suggestions as to how work may be improved, providing writing frames (outlines, paragraph headings or section headings), etc.

Drafting: for coursework, candidates are free to revise and redraft without teacher involvement before submitting the final piece; for controlled assessment, where allowed, teachers may review work and provide advice at a general level.

Similarities in students' work noted during moderation will be investigated. Penalties for teacher malpractice are listed in the JCQ documentation.


8 Instructions for Conducting Coursework

9 14.5 The centre should inform candidates of the marks which have been submitted to the awarding body, but in doing so must make it clear that those marks are subject to change through the moderation process. Candidates should be advised of their marks within a sufficient window in order to allow time for any internal appeal to be concluded prior to the submission of centre marks to the awarding body.

10 Administration Considerations
Files for assessment must be submitted on CD or USB pen; we would recommend a USB pen, as some centres submitted multiple CDs. The submission must include the executable program, the project files and the report (one document, either a PDF or Word file) for each candidate. It is imperative that centres indicate, with comments and specific page numbers, where candidates were awarded credit on the Electronic Candidate Record Sheets (ECRS). All username and password information should be recorded separately to aid the moderation process, preferably in a separate document or in the ECRS for each candidate.

11 General Moderation Findings
Generally the marking was fair, consistent and within the agreed standard for the majority of centres. Where centres did over-mark, it was most evident for AO2. Full or high marks for AO2 should only be considered where the coding, structure and content are excellent; there are no issues with HCI; validation has been used correctly; the recording of scores works successfully; external files are used; and the game is generally bug free.

12 General Moderation Findings
The code should be efficiently written and repetition of code kept to a minimum. Candidates should be encouraged to make good use of try/catch, get/set and, for example, specific or custom exceptions (a sketch follows below). Test plans were better, but some centres still did not include appropriate test data values or expected outcomes linked to requirements where applicable, and very few failed tests were recorded. The use of randomisation linked to an external bank of questions, giving the user a different experience each time, should be encouraged.
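As an illustration of the constructs mentioned above, here is a minimal sketch, assuming (as is typical for this unit) C#; the class, property and message names are invented for the example:

    using System;

    // Custom exception for one specific, named failure condition.
    public class NegativeScoreException : Exception
    {
        public NegativeScoreException(string message) : base(message) { }
    }

    public class Player
    {
        private int score;

        // A get/set property that validates on assignment.
        public int Score
        {
            get { return score; }
            set
            {
                if (value < 0)
                    throw new NegativeScoreException("Score cannot be negative.");
                score = value;
            }
        }
    }

    public class Program
    {
        public static void Main()
        {
            Player player = new Player();
            try
            {
                player.Score = -5;  // triggers the validation in the setter
            }
            catch (NegativeScoreException ex)
            {
                Console.WriteLine("Error: " + ex.Message);
            }
        }
    }

A custom exception like this makes the error handling self-documenting: the catch block names the exact failure condition rather than catching a generic Exception.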

13 Specification Overview
Unit AS 1: Introduction to Object Oriented Development – External assessment: 2 hour question paper
Unit AS 2: Event Driven Programming – Internal assessment: development of an interactive game/quiz; software solution and portfolio of evidence
Unit A2 1: Systems Approaches and Database Concepts
Unit A2 2: Implementing Solutions – Based on pre-release material; software solution and portfolio of evidence
Weighting from 2017: AS 40%, A2 60%

14 The AS2 Assessment Criteria

15 AO1 – Level 4 “Candidate demonstrates an excellent knowledge and understanding of the requirements of the system. This is evidenced by the provision of a comprehensive statement of the requirements of the application.”

16 AO1 – Level 4
An opening section outlining the background to the task, followed by an exhaustive, detailed list of user requirements. The language used is important: it is unlikely that the client would use specific technical vocabulary such as "splash screen" or "drag and drop", although terms like "menu" or "avatar" would be acceptable. User requirements should be written so that it is clear they will carry through the project to implementation, testing and evaluation.

17 AO1 – Level 4 “Candidate demonstrates an excellent knowledge and understanding of the design process. This is evidenced by the provision of a highly detailed storyboard that relates specifically to the design process and acknowledges and addresses modifications as required.”

18 AO1 – Level 4
Storyboards were well developed on the whole, but they should include the following:
Some outline drawings of the proposed solution; these could be drawn by hand and then scanned as a PDF for inclusion in the report.
An overall initial template, as ideally the majority of screens in the solution should be uniform and have a consistent look and feel.
Reference to the functionality of each screen, linked to the proposed events, e.g. a pseudocode snippet (see the example below).
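For example, a storyboard entry for a "submit answer" button might be annotated with a short pseudocode snippet like the following (the event and behaviour are invented for illustration):

    ON CLICK btnSubmit
        IF no answer selected THEN
            DISPLAY "Please select an answer"
        ELSE IF selected answer = correct answer THEN
            score = score + 1
            DISPLAY "Correct!"
        ELSE
            DISPLAY "Incorrect"
        END IF
        LOAD next question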

19 AO1 – Level 4 “Candidate demonstrates excellent knowledge and understanding of the development and implementation of the solution by providing an appropriate, valid explanation of the event driven application. This is evidenced by an explanation showing a highly detailed understanding of triggers, multiple forms and menus and how the application is linked to simple files in the solution.”

20 AO1 – Level 4 “Candidate shows excellent knowledge and understanding of the need for a robust and dependable system. This is evidenced by the inclusion of a comprehensive test plan that tests all navigation and data capture.”

21 AO1 – Level 4
Form designs with controls labelled appropriately.
Detailed control property tables with correct use of naming standards for the various controls (see the example after this list).
Functionality explained through an algorithm or pseudocode (no need for actual code).
Links back to user requirements where applicable.
Designs should be of sufficient quality that they could be given to a programmer for third-party implementation!
To reduce documentation, use a standard template for form style (e.g. font, background colour, placement of controls such as navigation buttons and the logo).
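As an illustration, a control property table following common naming standards might look like this (all names invented for the example):

    Control    Name        Key properties
    Form       frmQuiz     Text = "Quiz"
    Button     btnSubmit   Text = "Submit"
    TextBox    txtAnswer   MaxLength = 50
    Label      lblScore    Text = "Score: 0"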

22 AO1 Issues in 16/17
The majority of centres used a quiz as the solution, with only a few centres encouraging a games-based solution as an alternative; however, the majority of the games-based solutions that were implemented demonstrated a higher level of programming skill. Some requirements contained specific high-level computing terminology rather than being written from the point of view of the client. The requirements must be carried through the whole project.

23 AO1 Issues in 16/17
Lack of feedback on the initial designs from the client. Lots of repeated information in the designs, which could be removed by using a template. Designs lacked links from specific events and triggers to the elements on the forms; these should also cover proposed interaction with external files.

24 AO1 Issues in 16/17
Testing: still too many repetitive tests and related screenshots documented, which is very time consuming for the candidate. A lack of failed tests recorded. Encourage user/peer testing, which may highlight bugs within the solution.

25 AO2 – Level 4 “Candidate demonstrates excellent application of knowledge and skills with regard to the use of GUI objects in an event driven application. This is evidenced by the provision of a relevant detailed event driven application with a comprehensive range of relevant screen shots and graphics.”

26 AO2 – Level 4
Take a holistic approach to the event-driven application from the point of view of the user. Overall playability, the user experience and the user interface should all be considered. Encourage candidates to use bigger screens where applicable, as some solutions had small screens and the interface was squashed.

27 AO2 – Level 4 “Forms show excellent understanding of the requirements of the application.” “Forms are logically ordered and clearly fit for purpose as a consequence of thorough user evaluation. This is evidenced by the accuracy, layout and organisation of the forms.”

28 AO2 – Level 4 “Candidate shows excellent application of knowledge and skills in relation to the solution to the problem. This is evidenced by an excellent solution that meets all user requirements in terms of screen navigation, data capture and output produced. This is evidenced by a comprehensive range of screenshots and by detailed reference to the code produced.”

29 AO2 – Level 4 “Candidate shows excellent application of knowledge and skills in relation to testing the solution. This is evidenced by a comprehensive range of screenshots demonstrating the implementation of the test plan. The screenshots show all navigation and all data capture being tested.”

30 AO2 – Level 4 Moderators will run the game and view the code used.
A variety of methods of interaction should be explored for enhanced user participation. Binary files should be considered for the permanent storage of records, and text files for questions and answers (a sketch follows below). Program code should include relevant comments relating to each method, class, etc.
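A minimal sketch of the file handling described above, assuming C# and invented file names (scores.dat, questions.txt):

    using System;
    using System.IO;

    public class FileHandling
    {
        // Append a player's record to a binary file for permanent storage.
        public static void SaveScore(string name, int score)
        {
            using (BinaryWriter writer = new BinaryWriter(File.Open("scores.dat", FileMode.Append)))
            {
                writer.Write(name);
                writer.Write(score);
            }
        }

        // Load questions and answers from a text file,
        // one "question|answer" pair per line.
        public static string[] LoadQuestions()
        {
            return File.ReadAllLines("questions.txt");
        }
    }

The binary file keeps player records compact and permanent, while the plain-text question file is easy for a non-programmer to edit.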

31 AO2 Issues in 16/17
Still issues with screen size and general HCI issues. Some solutions had a colourful background that made the text hard to read; text was also hard to read due to the small size used. There were general issues with the placement of elements on forms that did not line up, which looks unprofessional to the user. Form names need to be meaningful.

32 AO2 Issues in 16/17
A lack of external files (text and binary) used for the permanent storage of records. For some candidates there was a lack of meaningful comments, especially relating to each method, class, etc. Validation needs to be implemented to a better standard: a number of solutions crashed when the wrong data types were entered or when multiple selections were made when answering questions (see the validation sketch below).
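A minimal sketch of defensive input validation, assuming C#; the variable names and messages are invented for the example:

    using System;

    public class ValidationDemo
    {
        public static void Main()
        {
            string input = "abc";  // simulated user input from a text box
            int age;

            // int.TryParse returns false instead of throwing an exception,
            // so bad input can be handled gracefully rather than crashing.
            if (int.TryParse(input, out age) && age > 0)
            {
                Console.WriteLine("Accepted: " + age);
            }
            else
            {
                Console.WriteLine("Please enter a positive whole number.");
            }
        }
    }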

33 AO2 Issues in 16/17
Be wary of using local path names; for example, some solutions that tried to read from an external file used a local path name or referenced a file that did not exist in the solution submitted (see the sketch below). Candidates should be reminded to use validation checks where applicable, e.g. try/catch and get/set, and should be encouraged to use specific or custom exceptions. Some candidates did not rename forms when implementing a solution.
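A minimal sketch of portable file access, assuming C# and an invented file name (questions.txt):

    using System;
    using System.IO;

    public class PathDemo
    {
        public static void Main()
        {
            // Build the path relative to the folder the program runs from,
            // rather than hard-coding a drive or folder that only exists
            // on the development machine.
            string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "questions.txt");

            if (File.Exists(path))
            {
                Console.WriteLine(File.ReadAllLines(path).Length + " questions loaded.");
            }
            else
            {
                Console.WriteLine("Question file not found: " + path);
            }
        }
    }

Building the path from the program's own base directory means the solution still finds its files when run on the moderator's machine.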

34 AO2 Issues in 16/17
General game play issues:
Some screens where a timer was used did not give enough time to read the questions. Multiple forms opened at the same time, which in some cases crashed the system. Scoring was unclear and it was hard to know whether a question was right or wrong, i.e. a lack of ongoing feedback. Randomisation was not used; ideally, in a quiz-based solution, the user would get a different experience each time (see the sketch below). The high score table was not stored or ordered correctly.
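A minimal sketch of question randomisation, assuming C# and an invented file name (questions.txt):

    using System;
    using System.IO;

    public class ShuffleDemo
    {
        public static void Main()
        {
            string[] questions = File.ReadAllLines("questions.txt");
            Random rng = new Random();

            // Fisher-Yates shuffle: the user sees the questions in a
            // different order on every run.
            for (int i = questions.Length - 1; i > 0; i--)
            {
                int j = rng.Next(i + 1);
                string temp = questions[i];
                questions[i] = questions[j];
                questions[j] = temp;
            }

            foreach (string q in questions)
            {
                Console.WriteLine(q);
            }
        }
    }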

35 AO3 – Level 4 “Candidate demonstrates an excellent analysis and evaluation of the outcomes of the testing procedures and the results obtained.” “Candidate comprehensively evaluates the solutions in the final report with comprehensive reference to initial user requirements, features and functionality of the solution.”

36 AO3 – Level 4
Candidates should list the user requirements and link them to the testing to show evidence of how they were met, or what corrective action should be taken. Encourage candidates to record failed tests, as the majority of test plans reference only perfect testing; candidates probably test as they go and correct errors as they code.

37 AO3 – Level 4 “Candidate comprehensively evaluates their own performance in terms of time management and development of personal skills. Candidate comprehensively identifies how their own performance could be improved.”

38 AO3 – Level 4
It would be beneficial if students carried out some beta/end-user testing and recorded evaluations by prospective end users. Completed end-user testing documents (e.g. completed questionnaires) should be placed in an appendix at the end of the report so that they do not disrupt the flow of the report. This feedback from users would highlight limitations and areas for improvement.

39 AO3 – Level 4 “Relevant material is succinct, well organised and presented with a high degree of clarity and coherence.” “Use of specialist vocabulary and spelling, punctuation and grammar is excellent.”

40 AO3 – Level 4
Evaluations should reference any user requirements that were not met, with valid reasons why. The evaluation should reference future developments or improvements for the solution. Any updated design or functionality not included in the original user requirements should also be listed.

41 AO3 Issues in 16/17
Some evaluations submitted lacked any reference to the full range of requirements documented in AO1, which is required to demonstrate the robustness of the solution. The commentary in AO3 should be reflective in nature. Encourage the use of external/peer testing and the related feedback, where appropriate, to highlight limitations and areas for improvement in AO3. For the majority of centres, AO3 was generally well completed.

42 Exemplars & Table Marking

