What are the dangers and risks for children in the virtual space?
Tech majors are increasingly finding themselves in the midst of a maelstrom of protests across the world.
It's not just over privacy concerns, but also over the security of users online.
Across the world, parents and activists are aggressively advancing the agenda of having the tech companies take responsibility, or provide platforms that are ‘safe by design’ for children and young users.
A UNICEF report from last year, ‘The Metaverse, Extended Reality and Children’, analysed how virtual environments may evolve and how they are likely to influence children and young adults.
These technologies do offer many potential benefits for children, such as in the areas of education and health.
What is the responsibility of tech companies?
The primary responsibility is that of the tech companies, which will have to incorporate ‘safety by design’.
The proceedings of the Congressional hearings have made it obvious that these companies are fully cognisant of the extent to which their apps and systems impact children negatively.
Drawing on the Convention on the Rights of the Child, UNICEF offers guidance that lists nine requirements for child-centred AI, including support for children’s development and well-being, and protecting children’s data and privacy.
UNICEF recommends that tech companies apply the highest existing data protection standards to children’s data in the metaverse and virtual environments.
In addition, governments have the burden of periodically assessing and adjusting regulatory frameworks to ensure that such technologies do not violate children’s rights, and of using their might to address harmful content and behaviour inimical to children online.
Ultimately, as Ms. Suresh points out, everyone must start from the assumption that all the rules that exist in the real world to protect children should also prevail online.
Should public pressure be stepped up to incorporate more safety aspects, especially for use by children?
The potential risks to children are significant, the report points out.
These include safety concerns such as exposure to graphic sexual content, bullying, sexual harassment and abuse, which in immersive virtual environments can feel more ‘real’ than on current platforms.
Further, vast amounts of data, including about non-verbal behaviour, are collected, potentially allowing a handful of large tech companies to facilitate hyper-personalised profiling, advertising and increased surveillance, impacting children’s privacy, security, and other rights and freedoms.
While the complete immersion in an alternate reality which the metaverse promises is still not here, there are multiple virtual environments and games that, though not fully immersive, are indicative of the dangers of coping with that world.
As Ms. Suresh of the Centre for the Prevention and Healing of Child Sexual Abuse points out, in the hugely popular Grand Theft Auto, which has adult and child versions, there is an instruction in the adult version to ‘approach a prostitute and spank her many times’.
More recently, she adds, there were reports in the media about how children were using Artificial Intelligence to generate indecent child abuse images.
Then there is the mental health aspect: children face the prospect of trauma, solicitation and abuse online, which can leave deep psychological scars that impact their lives in the real world too.
Innocuous and innocent sharing of images online can also be twisted by depraved predators.
End-to-end encryption is essential to protect the information that children share online.
How can governments step in with regulatory frameworks?
The issue of regulating internet use for children is complex and raises various concerns about effectiveness, privacy, and freedom of expression.
Here are some potential ways governments could approach this issue through regulatory frameworks:
Content filtering and access restrictions:
Age-gating: Requiring age verification for access to specific content or platforms deemed inappropriate for younger users.
Blacklisting harmful content: Blocking access to websites and content deemed harmful, violent, or exploitative towards children.
Parental controls: Providing tools for parents to manage and restrict their children's online access and activities.
Data privacy and protection:
Stronger data protection laws: Implementing regulations that limit data collection from children, require parental consent, and ensure data security and privacy.
Transparency and accountability for platforms: Requiring platforms to be transparent about data collection practices and accountable for data breaches or misuse.
Digital literacy and online safety education: Providing age-appropriate education programs to teach children about online safety, digital citizenship, and responsible internet use.