To protect young people against online harms, the Department for Education (2024: 40) requires UK schools to implement filtering and monitoring software that enables them to “block harmful and inappropriate content without unreasonably impacting teaching and learning”.
Many filtering and monitoring systems use ‘keyword monitoring’ to track language use on online devices and identify specific words or phrases (e.g. ‘bomb’) that correlate with a specific form of risk (e.g. violence). However, this approach has limitations: filtering and monitoring software tends to raise a concern only when there is a direct match to a ‘keyword’, and the ‘keywords’ themselves are often isolated from their context(s) of use. This can lead to ‘false positives’, wherein a keyword match raises an automatic safeguarding concern (e.g. ‘bomb’) even when the use of the keyword is innocuous (e.g. ‘bath bomb’).
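The limitation can be illustrated with a short, hypothetical sketch (it does not reproduce Senso.cloud’s software or the study’s methods): a naive matcher flags any message containing the keyword ‘bomb’, including innocuous uses such as ‘bath bomb’, whereas even a crude allow-list of innocuous phrases suppresses that particular false positive.

```python
# Hypothetical sketch of naive keyword matching and one way to reduce
# false positives. Keywords and phrases here are illustrative only.

KEYWORDS = {"bomb"}
INNOCUOUS_PHRASES = {"bath bomb"}  # example allow-list entry

def naive_flag(message: str) -> bool:
    """Flag a message whenever any keyword appears as a token."""
    tokens = message.lower().split()
    return any(keyword in tokens for keyword in KEYWORDS)

def context_aware_flag(message: str) -> bool:
    """Suppress the flag when the keyword occurs only inside an innocuous phrase."""
    text = message.lower()
    if not naive_flag(message):
        return False
    return not any(phrase in text for phrase in INNOCUOUS_PHRASES)

print(naive_flag("I bought a bath bomb"))          # True  -> false positive
print(context_aware_flag("I bought a bath bomb"))  # False -> suppressed
print(context_aware_flag("I will plant a bomb"))   # True  -> genuine concern
```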
In this webinar, Dr Charlotte-Rose Kennedy and Dr Mark McGlashan will demonstrate how corpus linguistics methods are used to enhance current practice at Senso.cloud, a safeguarding solutions provider.
They will outline a study of a 1,094,914-word corpus of online testimonies relating to suicide and suicidal ideation. Specifically, keyword, collocation, and concordance analyses are used to identify the range of linguistic patterns through which suicidal ideation is expressed in natural language (e.g. ‘am in immense pain’), providing findings that can enable more context-sensitive, empirically grounded approaches to ‘keyword monitoring’.
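For readers unfamiliar with the methods named above, the toy sketch below shows what keyword-in-context (KWIC) concordance lines and simple collocate counts look like for a node word such as ‘pain’. It is a hypothetical example over made-up sample text, not the study’s corpus, statistics, or tooling.

```python
# Toy keyword-in-context (KWIC) concordance and raw collocate counting.
# Illustrative only; the webinar's corpus and analysis pipeline are not reproduced here.
from collections import Counter

sample = (
    "i am in immense pain and i cannot cope "
    "the pain never stops and i feel alone "
    "talking about the pain helped a little"
)

def kwic(tokens, node, window=3):
    """Yield concordance lines: the node word with `window` tokens either side."""
    for i, tok in enumerate(tokens):
        if tok == node:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            yield f"{left:>25} [{node}] {right}"

def collocates(tokens, node, window=3):
    """Count words co-occurring with the node word within the given window."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            counts.update(tokens[max(0, i - window):i])
            counts.update(tokens[i + 1:i + 1 + window])
    return counts

tokens = sample.split()
for line in kwic(tokens, "pain"):
    print(line)
print(collocates(tokens, "pain").most_common(5))
```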
Department for Education (2024) Keeping children safe in education 2024: statutory guidance for schools and colleges. Available at: https://assets.publishing.service.gov.uk/media/6650a1967b792ffff71a83e8/Keeping_children_safe_in_education_2024.pdf (accessed 18 June 2024).