
Internet Matters' response to the AVMSD consultation


Video sharing platforms are a key part of digital life. You can read our response to Ofcom's consultation on how to regulate these platforms here.

Internet Matters is pleased to take part in this welcome consultation. We have a few introductory comments which frame our thinking before we get into the specific questions. Internet Matters exists to help families benefit from connected technology. We are a not-for-profit, funded by the internet industry – and we are pleased to bring leading brands together to focus on child safety and digital wellbeing. We provide expert advice to parents, presented in a really usable way, by age of the child, by device, app or platform, or by issue.

We know that engaging with our content gives parents, carers and, increasingly, professionals the confidence and tools they need to engage with the digital lives of those they care for. Having an engaged adult in a child's life is the single most important factor in ensuring they are safe online, so providing those adults with the tools, resources and confidence to act is a fundamental part of digital literacy.

The regulatory basis and its complexity for parents

The AVMSD derives from a European legal framework built on the 'country of origin' principle. In practice, this means, as the consultation document states, that the interim regulatory framework will only cover 6 mainstream sites and 2 adult content sites. This will prove very hard to explain to parents, who could rightly expect that if content is viewable in the UK it is regulated in the UK.

While acknowledging the limitations of the AVMSD, the consultation repeatedly notes that, to the extent the forthcoming Online Harms legislation will address age-inappropriate content, it will not feel bound to honour the country of origin principle. This is a position we endorse. If content is viewable in the UK, then it should conform with UK rules.

The AVMSD is not limited in this way. In this case the AVMSD has got it right. It makes no sense to limit the scope of the Online Harms legislation to platforms which allow user-generated content to be published. What matters is the nature of the content, not how or by whom it was produced.

Question 19: What examples are there of effective use and implementation of any of the measures listed in article 28(b)(3) of the AVMSD 2018? The measures are terms and conditions, flagging and reporting mechanisms, age verification systems, rating systems, parental control systems, easy-to-access complaints functions, and the provision of media literacy measures and tools. Please provide evidence and specific examples to support your answer.
Our work listening to families informs everything we do – and given we are a key part of delivering digital literacy for parents, we wanted to share some insights with you. Parents seek advice about online safety when one of four things happens:

  • There is a new device in the home
  • There is a new app/platform on a device
  • Children start secondary school
  • A safety concern is triggered for any of a number of reasons, including lived experience, prompts from school and media coverage

Parents seek help most often through an online search or by asking for help at school. Clearly, throughout lockdown, searching for solutions has been more important, meaning evidence-based advice from credible organisations must be at the top of the rankings.

Once parents are engaged with advice, it has to be easy to understand – and so we regularly poll parents on what they would think, talk about and do differently after engaging with our website. The charts below demonstrate that serving parents content that meets their requirements drives meaningful and measurable change.

These data points indicate that digital literacy amongst parents can be, and is, influenced by good-quality resources – which equip them to have routine conversations with their children about their digital lives. Moreover, our pages on parental controls consistently rank among our top 10 most popular pages.

Question 20: What examples are there of measures which have fallen short of expectations regarding users' protection, and why? Please provide evidence to support your answer wherever possible.
We have to conclude that moderation of live streaming is not working currently and perhaps cannot work; abuse of platforms' terms and conditions happens in real time. In the following two examples it was not simply terms and conditions that were abandoned – it was much more serious. The tragic recent suicide was circulated globally within seconds, and although platforms took quick and decisive action, too many people saw that harrowing content on mainstream apps, with little or no warning as to its graphic nature. As we all know, this was not the only example of live streaming moderation failure, as the Christchurch shootings highlighted back in March 2019.

Clearly, these are complex issues where someone deliberately sets out to devastate lives through their own actions and their decision to live stream them. Of course, the two examples are not comparable save in what we can learn from them and what a regulator could meaningfully do in these situations. Perhaps it is in the very extreme and exceptional nature of this content that comfort can be found – in that in nearly every other circumstance this content is identified and isolated in the moments between uploading and sharing. Clearly, these are split-second decisions which are reliant on outstanding algorithms and qualified human moderators. Perhaps the role of the regulator in this situation is to work with platforms onto which such content can be or was uploaded, viewed and shared, to understand and explore what went wrong, and then agree concrete actions to ensure it cannot happen again. Perhaps those learnings could be shared by the regulator in a confidential way with other platforms, simply for the purpose of ensuring lessons are learnt as widely as possible – for the protection of the public, and where appropriate for the company to provide redress. For that to work, the culture of the regulator and its approach has to be collaborative and engaging rather than remote and punitive.

Suggestions the regulator may wish to deploy could include (but not be limited to) requiring companies to develop plans to work together so that notifications are shared across platforms immediately, given there is no commercial advantage in keeping this information within one platform.

The other issue that requires detailed consideration is comments under videos – be that toddlers in a paddling pool or teenagers lip-synching to music videos. Perhaps there are two separate issues here. For the accounts of young people aged 13-16, unless and until anonymity on the internet no longer exists, platforms should be encouraged to take a cautious approach to comments, removing anything that is reported and reinstating it once the comment has been validated.

We would encourage the regulator to continue to work with platforms to identify videos that, although innocent in nature, attract inappropriate comments, and to suspend the ability to comment publicly under them. Often account holders have no idea who the comments are being left by, and context is everything. A peer admiring a dance move or an item of clothing is materially different from comments from a stranger.

For as long as sites are not required to verify the age of their users, live streams will be both uploaded and watched by children. Children have as much right to emerging technology as anyone else – and have to be able to use it safely. So the challenge for the regulator becomes how to ensure children who are live streaming can do so without inappropriate contact from strangers.

Whilst many young people tell us they like and appreciate the validation they receive from comments, the solution isn't to retain the functionality. It's to stop it, and to invest the time and money in understanding what is happening in the lives of our young people that makes the validation of strangers so meaningful to them.

For parents posting images of toddlers in paddling pools, there are both technical and educational responses. It should be possible for images to be viewable only in a private mode, so that strangers cannot comment. Secondly, there should be educational activity aimed at parents – perhaps beginning with conversations between expectant mothers and midwives about which parts of a child's life are appropriate to post online for the world to see. The regulator could play a role in challenging the showreel lifestyle that has quickly become the norm.

Question 21: What indicators of potential harm should Ofcom be aware of as part of its ongoing monitoring and compliance activities on VSP services? Please provide evidence to support your answer wherever possible.
Over the past 18 months, Internet Matters has invested significant time and resources in understanding the online experiences of vulnerable children, and in particular how they differ from those of non-vulnerable children. Our report, Vulnerable Children in a Digital World, published in 2019, demonstrates that vulnerable children have very different online experiences. Further research demonstrates that children and young people with SEND are at particular risk, as they are less able to critically assess contact risks and more likely to believe people are who they say they are. Likewise, care-experienced children are more at risk of seeing harmful content, particularly around self-harm and suicide. There are many more examples.

The point here is not that vulnerable young people should have a separate experience if they identify themselves to the platforms, but rather that the regulator and platforms should recognise that there are millions of vulnerable children in the UK who require additional support to benefit from connected technology. The nature of that support will vary, but it will inevitably include additional and bespoke digital literacy interventions, as well as better content moderation to remove dangerous content before it is shared.

Data from the 2019 Cybersurvey, published by Youthworks in partnership with Internet Matters, indicates that:

  • In 2019, 13% of young people had at some point seen content encouraging self-harm or suicide
  • More young people personally experienced racist bullying or aggression online in 2019 than in 2015: 13% versus 4%
  • Content risks are more common than contact risks:
    • (while fitness content is positive, 'stacking' can be harmful if it encourages young people to use substances that may not be labelled)

Teenagers who are already vulnerable often see content about self-harm – particularly those with an eating disorder (23%) or a speech difficulty (29%) – compared with just 9% of non-vulnerable young people who have ever seen it, and only 2% who see it often.

Question 22: The AVMSD 2018 requires VSPs to take appropriate measures to protect minors from content which 'may impair their physical, mental or moral development'. Which types of content do you consider relevant under this? Which measures do you consider most appropriate to protect minors? Please provide evidence to support your answer wherever possible, including any age-related considerations.

In addition to the self-harm and suicide content detailed in our answer to Question 21, there are several other types of content that might impair the physical, mental or moral development of children, including (but not limited to):

  • Pornography, and all the other adult and sexualised content that surrounds pornography. This also includes the impact this content has on children's perceptions of healthy relationships, consent and the role of women. Our report, We Need to Talk About Pornography, details these issues, including parental support for age verification
  • Violence – the normalisation of violence, and the content implicit in some genres of music and in gang culture, can be hugely damaging
  • Criminal activity – from the use of VSPs to recruit minors into county lines, to the glamorisation of criminal lifestyles, there is a range of harmful content that encourages crime
  • Gambling, smoking and alcohol, knives – children should not be able to gamble online; it's illegal offline and should be both illegal and impossible online. Likewise, there are age restrictions on the sale of restricted items such as tobacco, alcohol and weapons, and this should mean it is impossible for children to view this content in the form of an advert that glamourises it, or to be presented with an opportunity to purchase
  • Ideology/radicalisation/extremism – while we would not seek to restrict freedom of speech, children and young people deserve particular protection from radical and extreme ideologies and content

Perhaps the way to consider this is by reviewing and updating the content categories that internet service providers use for blocking content through parental control filters. All user-generated content should be subject to the same restrictions for children. What matters to the health and wellbeing of children is the content itself, not whether it was created by a mainstream broadcaster or by someone down the road.

What measures are most appropriate to protect minors?

  • Restrictions – some content should only be made available to certain audiences
  • Extensive use of splash-screen warnings to identify legal but harmful content
  • Robust action against users who create content that breaches a platform's terms and conditions
  • Age verification for adult content, and age assurance for users aged 13-16

This is urgent, as our data shows that experiences of online harms increased during lockdown.

Question 23: What challenges might VSP providers face in the practical and proportionate adoption of measures that Ofcom should be aware of? We would be particularly interested in your reasoning on the factors relevant to the assessment of practicality and proportionality.
It may be useful here to draw a distinction between illegal content, where there is a very clear requirement, and legal but harmful content, where a much messier world exists.
This is a serious and complex problem which will require significant work between the platforms and the regulator to resolve. Given the Government is minded to appoint Ofcom as the Online Harms regulator, there will be as much interest in how this is done as in the fact that it is done. Precedents will be set, and expectations created.

Question 24: How should VSPs balance their users' rights to freedom of expression, and what metrics should they use to monitor this? What role do you see for a regulator?

  • Clarity in community guidelines on what is appropriate and what is not, and what will be acceptable and tolerated. Abuse it and you're off. Freedom of expression is not curtailed, because you could find another platform to express those views – but they are not acceptable on this one
  • Metrics – prevalence, removal volumes and reporting
  • The regulator's role is to ensure community standards are enforced, in the knowledge that people will push against and evade the rules, so human moderation and common sense are also required
  • The regulator also needs to recognise that education is a key part of this, and should therefore encourage, value and give credit to VSPs that invest in independent education programmes designed to build digital literacy

Question 25: (see paragraph 2.32 and article 28(b)(7)). Please provide evidence or analysis to support your answer wherever possible, including consideration of how this requirement could be met in an effective and proportionate way.

Internet Matters has no comment on this question.

Question 26: How can Ofcom best support VSPs to continue to innovate to keep users safe?

  • Recognise investment in apps and in media/digital literacy interventions that can demonstrate impact through robust evaluation
  • Ensure VSPs recognise that regulatory compliance is about more than content removal – as in the Irish model, it must include measures to minimise the spread and amplification of harmful content
  • Be clear about the intent of the requirements – again as in the Irish model, where the direct result of the measures taken is that, over time, significantly fewer people are exposed to harmful content within a cycle of harm minimisation
  • Make reporting of concerning content as easy as uploading content, and keep those who report aware of the process and likely resolution timescales. This should include clearly published response times that meet a minimum standard and keep users informed. Additionally, we suspect that some of the wording around reporting content is off-putting for children, so we suggest work is done to identify the most appropriate wording and process for young people, so that they are more likely to flag this content. There also needs to be a sustained effort on the part of the platforms to restore confidence in their reporting mechanisms, so that users of all ages believe that something will happen if they make a report
  • Make reporting easy for minors – so platforms should test with them the most appropriate way to do that. Is complex, specific language best for young people, or would softer language like "I don't like this" or "this content makes me unhappy" be more effective? Additionally, prioritise their concerns, and perhaps trial removing content reported by minors immediately, then examining it and reinstating it if appropriate. If we really wanted to make the internet a safe place for children, we would focus on their needs – on the platforms they are likely to frequent

Question 27: How can Ofcom best support businesses to comply with the new requirements?

  • Recognise the limitations in the scope and timing of the requirements – and message them accordingly. If the regulations only apply to 6 or 8 organisations, don't overclaim – they will not be world-leading. This is important so that parents are realistic about what changes the requirements will bring about, and do not become less vigilant because they believe there is a regulated solution
  • Recognise that size is not a prerequisite for the existence of risk and harm, and that in every other consumer product domain businesses cannot put less safe or more risky products on the market because they are small. Micro-breweries have the same legal requirement to comply with all appropriate health and safety regulations as Coca-Cola. It's the same for toy manufacturers and film producers. The right to be safe – or, in this case, to not be harmed – is absolute, and does not depend on the size of the organisation you are consuming a product or service from

Question 28: Do you have any views on the set of principles set out in paragraph 2.49 (protection and assurance, freedom of expression, adaptability over time, transparency, robust enforcement, independence and proportionality), and on the balance that sometimes has to be struck between them?

  • The Irish proposals recognise this is an iterative process, so we welcome the ambition to be agile and innovative. The focus of regulation is compliance with codes rather than personal behaviour – but there still needs to be a place for education, so that behaviour is addressed. The Law Commission's current consultation on online crimes is also an interesting intervention here, as such changes to the law will create legal, and therefore cultural, clarity around what is acceptable and legal behaviour online
  • Freedom of speech and expression concerns can be addressed through terms and conditions – so there may be a place where your extreme views are welcome, but this isn't the appropriate platform for that. This is not suggesting you can't express those views, but simply stating that you cannot do so on this platform
  • Recognise the challenges of age verification for minors, the margin of error in age assurance, and the inevitable limitations of these technologies
