Molly, 14, from Harrow, north-west London, saw a large amount of online material, including some relating to anxiety, depression, self-harm and suicide, in the months before she died in November 2017. Her father, Ian Russell, has become a prominent campaigner for the regulation of social media platforms to better protect young people from harmful content.

The inquest has been delayed several times owing to legal and procedural issues, including requests from Instagram’s owner, Meta, to redact content to protect users’ privacy.

The hearing will include witnesses from Meta and Pinterest appearing in person after the senior coroner, Andrew Walker, said at a preliminary hearing this month that appearing by video link could create difficulties in viewing evidence at the same time. Meta’s head of health and wellbeing policy, Elizabeth Lagone, and Jud Hoffman, head of community operations at Pinterest – both based in the US – are due to give evidence at the inquest at Barnet coroner’s court.

Lawyers representing Pinterest and Meta – which also owns Facebook and WhatsApp – had argued that Lagone and Hoffman could testify remotely during the two-week hearing, citing issues such as their busy work schedules and the risk of catching Covid.

Speaking at the preliminary hearing, Walker said: “The examination of the witnesses will involve the witnesses looking at videos, referring to documents, and that will be best done in the context of the court itself.”

Meta has disclosed 12,576 pieces of Instagram content that Molly had seen in the six months before her death, while she had more than 15,000 engagements on Pinterest, including 3,000 rejections, in the last six months of her life. During that period, Molly engaged with Instagram posts about 130 times a day on average, including 11,000 likes, 3,500 shares and 5,000 saves.
Meta will not be required to name the handles of anonymous Instagram accounts viewed by Molly after the company produced guidance from the UK’s data watchdog stating that disclosing such details would breach data laws. Meta added that it was concerned about “potentially identifying vulnerable Instagram users”.

The inquest takes place against a backdrop of regulatory change for social media companies in the UK. The online safety bill, whose progress through parliament has stalled, would impose a duty of care on tech companies to protect users from harmful content, including ensuring that children are not exposed to harmful or inappropriate material.

The prime minister, Liz Truss, has confirmed the bill will go ahead, saying the government wants to ensure under-18s are protected from harm but also to “make sure freedom of speech is allowed”.

The new regime will be overseen by the communications regulator, Ofcom, which will have the power to impose fines of up to £18m, or 10% of global turnover, for breaches of the law.