JCQ guidance
You should read JCQ’s latest Information for Candidates guidance on NEAs [5] in addition to the guidance here.
Authentication statement
Before you submit your NEA for marking, you will need to sign an authentication statement in which you declare that all the work submitted is your own. If, after signing this statement, it is discovered that your declaration was false—that your NEA contains uncredited work that isn’t yours—you risk disqualification from your courses.
If you have concerns about authorship or referencing in your work, you must raise them with your teacher before signing the authentication statement. Until that point, issues can usually be resolved within school. Once the statement is signed, any concerns fall under JCQ regulations and are no longer within your teachers’ control.
It is not worth the risk or the long-term pain to include uncredited work that is not your own. By following this guidance, you can ensure you stay on the right side of acceptable academic practice.
While this might all seem severe, academic institutions value academic honesty highly. Academic dishonesty at the university level, for example, can result in your expulsion. More importantly, academic honesty is a matter of personal integrity: it is wrong to falsely represent the work of others as your own.
Avoid malpractice through referencing
When you use (directly or indirectly) work that is not your own, you must credit the original source—whether it comes from a teacher, an AI chatbot, an online tutorial, or any other source. Such credit is called a citation. Your citations must make it possible to identify the original source, when it is from, and who created it. Even if you put someone else’s idea in your own words, you must cite the original source.
You are not required to follow any particular referencing style. You may choose to use in-text citations or footnotes, as long as it is clear that the work or ideas presented are not your own. Whichever approach you choose, you should apply it consistently.
The H446 specification says that NEAs should contain a bibliography “as standard” [3].
Citations in software
The software you use to write your NEA may have built-in support for citations. Word and Typst, for example, both have built-in support for managing sources, inserting citations, and creating bibliographies.
JCQ on referencing
JCQ’s guidance on NEAs provides good examples of the different kinds of sources you might cite and what their citations may look like. This is what JCQ say:
When producing a piece of work, if you use the same wording as a published source, you must place quotation marks around the passage and state where it came from. This is known as referencing.
You must make sure that you give detailed references for everything in your work which is not in your own words. A reference from a printed book or journal should show the name of the author, the year of publication and the page number. For example: Morrison, 2000, p29.
For material taken from the internet, your reference should show the date when the material was downloaded and must show the precise web page, not the search engine used to locate it. This can be copied from the address line. For example: http://news.bbc.co.uk/onthisday/hi/dates/stories/october/28/newsid_2621000/2621915.stm, downloaded 5 February 2026.
Where computer-generated content has been used (such as an AI chatbot), your reference must show the name of the AI tool used and should show the date the content was generated. For example: ChatGPT 3.5 (https://openai.com/blog/chatgpt/), 25/01/2026. You should also reference the sources used by the AI tool in generating the content.
You must retain a copy of the question(s) and computer-generated content for reference and authentication purposes in a non-editable format (such as a screenshot) and provide a brief explanation of how you used it. This must be submitted with your work for final assessment so that your teacher can review the work, the AI-generated content and how it has been used.
… your bibliography must list the full details of publications you have used in your research, even where these are not directly referred to. For example: Curran, J. Mass Media and Society (Hodder Arnold, 2005).
If you copy the words, ideas or outputs of others and do not show your sources in references and a bibliography, this will be considered as cheating [5].
Cambridge OCR have their own guide to referencing that you can read for supplementary information.
Use tutorials responsibly
The 2024 Moderators’ Report describes the number of students nationally who submit code produced by blindly following a tutorial (e.g., on YouTube) as “a concern” [2]. The code you submit must be your own.
Cambridge OCR don’t say to avoid tutorials completely. They suggest some acceptable uses, including getting started with a new environment “while fully referencing the source of the code” [2].
They also suggest tutorials might be useful if an “insurmountable” problem is encountered. In such cases, the use of a tutorial can be acceptable if it is properly cited, informs only a small part of the work, and the “vast majority” of the rest of the code is demonstrably your own [2].
Cambridge OCR warn that, if you use a tutorial as a starting point, you must evidence your own development that “significantly” builds on the initial work and “goes in a new direction” [2].
Cite help from your teachers
If your teachers provide help with your code, you must cite it as you would if you received help from an AI chatbot or a tutorial. Your bibliography should include the date of the conversation and who it was with, and you should describe the nature of the support received in your write-up.
Use Artificial Intelligence tools surgically
As stated in the JCQ guidance above, you must credit any work related to your NEA stemming from conversations with AI chatbots. You should include screenshots of the entire conversations in your write-up in addition to an explanation of how you used the AI tool. You should include this information where it is most relevant (e.g., in your development diary when debugging) so that your usage of AI is entirely transparent. If the conversations are lengthy, they may be better included as an appendix, with your explanation of how you used the tool and its impact kept at the point of the relevant work in your write-up.
Mitigate risk
Work that is not your own cannot receive credit. You should therefore be extremely judicious when deciding when, if at all, to use AI tools to help you with your NEA. Cambridge OCR’s guidance around following tutorials applies to using AI tools too: they can be helpful to fix an isolated problem but should always be credited and should not produce the bulk of your work.
If you choose to use AI tools, you should aim to do so with surgical precision. Don’t be tempted to copy/paste your entire codebase into an AI chatbot as the risks are too high: getting feedback on your entire codebase to fix a localised bug potentially compromises your ownership of the work. Additionally, the entire conversation needs to be recorded and explained.
Be exacting in your prompts. Tell the chatbot not to give you complete solutions but to point you in the right direction instead. Consider asking for debugging approaches rather than asking the chatbot to debug the program for you. You cannot receive credit for AI-generated code, but you could receive credit for acting on AI-given advice. Even if you just ask for advice, you must still document the conversation with screenshots and a written summary.
Avoid using AI code assistants inside IDEs. You must credit AI tools whenever they help you and integrated code assistants tend to offer unrestrained help, compromising the integrity of your work. Normal auto-complete (e.g., tab completion of identifiers or key words) is fine but complete code suggestions are problematic.
Don’t use AI tools to write your write-up for you. Your write-up must be your own work.
Credit all work that isn’t yours. Failing to do so is catastrophically reckless.
AI use according to Cambridge OCR
The following is quoted directly from Cambridge OCR’s own guidance on using AI in each section of the NEA. In all cases, remember that you can only receive credit for work that is your own; AI should support the direction of your work rather than do the work for you.
Analysis
Generating ideas for projects can be supported by AI. AI tools can provide great ideas and develop concepts. AI tools can help with initial project concepts, and to provide more scope to a project.
For example, a student could use ChatGPT to provide stimulus to the question: “Write me 10 game project ideas that could use OOP paradigms.”
Ideas from ChatGPT or candidates could then be developed further using stimulus: “State 10 ways that I could use power ups in this game.”
AI tools could also be used to identify similar ideas or types of projects. This may speed up the research process. However, it’s also important that this stage is not solely driven by using AI tools [4].
AI can generate great prompts. However, candidates should steer away from generating final ideas. Analysis requires interactions with stakeholders. Candidates restrict access to upper mark bands where this is limited. Overuse of AI may restrict engagement with stakeholders, as the candidates become too dependent on the AI responses [4].
Design
There is no requirement for the design to be ‘perfect’ at this stage. Indeed, overuse of AI to reach a ‘perfect’ solution at this stage hinders the iterative development – which is designed to identify flaws, and allow re-design, iterative development and therefore editing/updating testing where changes are made.
Therefore, we’d suggest that candidates steer away from using AI as much as possible in this section [4].
The design does not have to be fully completed in one go. This is due to the iterative nature of the NEA. For example, a high-level sketch or idea is often refined later after a “first attempt” has been made at building a user interface for example. After testing, these earlier designs may be refined (and documented). The project goes through another iteration, and the result of the tweaked designs are re-evaluated [4].
Development
AI tools may be used to support debugging. AI tools could also be used to suggest ideas and methods to troubleshoot non-functional code.
For example:
- a candidate could ask it to suggest how a method or object could be written. But the candidate must show clearly how this suggestion has been adapted to suit their project.
- a candidate could copy and paste a method or section of code into an AI tool and ask it to suggest why it may not be compiling, or working as expected.
- a candidate could, if totally stuck, ask AI to provide a potential solution to a short section of code, reference and integrate this, and then show how they have then resumed independent work.
There is no harm in using AI tools to support the coding process but only if all AI use is well documented and referenced. Candidates must clearly show where AI tools are used, and how their independent work develops from AI generated content [4].
Much of the development encourages independent work. However, AI tools may be used to auto-comment code and solve challenges along the way.
The key to ‘good’ use of AI is to encourage research into techniques, rather than solutions. Researching techniques allow a candidate to adapt and modify their findings to their project.
For example: Researching “shortest path algorithms” would allow a candidate to explore which pathing algorithm they would want to use and why. Researching: “Write Dijkstra’s shortest path for my code” limits their ability to show independent development [4].
Testing and evaluation
AI use in this area should be very limited … AI could be used to provide prompts for candidates to respond to. For example: “What impacts might a partially-functioning GUI have on an end user?”. These questions are fine to use as prompts to generate candidate-driven discussion, but must be referenced [4].
Use of AI to test a program is not really much help, as the tests and data will have been defined earlier in the project.
One area that AI could be misused is in the remedial action taken to resolve issues. We do not expect that every project will work perfectly. There may be bugs in the system at final completion.
However, simply copying errors/issues and getting AI tools to solve them without referencing is cheating [4].
Using AI to generate the evaluation is poor practice and will mean it is very difficult to access upper mark bands [4].
References
[1] Cambridge OCR 2023. A Level Computer Science Moderators’ Report H446/03/04 Summer 2023 Series. Cambridge OCR.
[2] Cambridge OCR 2024. A Level Computer Science Moderators’ Report H446/03/04 Summer 2024 Series. Cambridge OCR.
[3] Cambridge OCR 2024. A Level Specification Computer Science H446. Cambridge OCR.
[4] Cattanach-Chell, C. 2026. Guidance and Support for the Use of AI in A Level Computer Science NEA. Cambridge OCR Blog.
[5] JCQ 2025. Information for Candidates: Non-Examination Assessments. JCQ.