
Published on October 4th, 2021


Playing catch-up with technology is not enough when it comes to safeguarding issues



In 2020, the Age Appropriate Design Code, which incorporates a range of design features relating to duty of care, became law; following a 12-month transition period, online services now have to comply with it. Further legislation is under way, with an Online Safety Bill expected to come into force in late 2023 or early 2024.

"The Online Safety Bill introduces the concept of a regulator for the internet. Ofcom will have powers to ensure that the major internet companies demonstrate a duty of care to their users. The bill seeks to ensure that children will not be exposed to online harm, and that companies can and will be fined if they fail in their duty of care," explains Carolyn Bunting of Internet Matters.

Although the Online Safety Bill is regarded as a broadly workable model, the NSPCC believes it needs strengthening to:

  • Stop grooming and abuse spreading between apps
  • Disrupt abuse at the earliest possible stage
  • Fix major gaps in the child safety duty: high-risk sites such as Telegram and OnlyFans could be excluded because only companies with a 'significant' number of children on their apps would be required to protect them, potentially displacing high-risk services to smaller sites
  • Hold senior management accountable, with companies liable for criminal sanctions if the duty of care is not upheld
  • Commit to a statutory user advocate for children.

Sonia Livingstone of the London School of Economics (LSE) points out a further problem with both the Online Safety Bill and the Age Appropriate Design Code: neither covers technology used in schools for learning or safeguarding, "because the contract is not provider-to-user but provider-to-school, the legal responsibility seems to be with the school rather than the digital provider".

"The bill seeks to ensure that children will not be exposed to online harm, and that companies can and will be fined if they fail in their duty of care" – Carolyn Bunting, Internet Matters

She adds, "Given the pace of technological change, it is vital for schools and also businesses to make use of anticipatory strategies like data protection impact assessments, safety by design and child rights impact."

CRIA (Child Rights Impact Assessment) was introduced as a way of assessing the impact of policies and programmes on children's rights. The Digital Futures Commission is now considering the feasibility of using CRIA as a means of embedding children's best interests in a digital world.

Creators of new systems are more interested in developing products than in safeguarding. With technology constantly evolving, the risk is that legislators and educators play 'catch-up' rather than taking the initiative.
