Newsletter, March 2023: TIP panels at #PSA23; Supreme Court ruling on Section 230; EP harmonises online political campaigning; & more



  • Working paper series: Send us your ideas; TIP panels at #PSA23
  • Supreme Court ruling on Section 230
  • EP harmonises online political campaigning
  • Privacy risks according to the CDT
  • Other stories we’re reading
  • Jobs & opportunities

Hello TIPers,

This month, we remind you of our call for expressions of interest in a new working paper forum – see details below and get in touch with your ideas and papers for some friendly and supportive feedback. Plus, we are excited to share our specialist TIP group panels taking place at PSA Liverpool on 3-5 April.

Our top stories this month include the Supreme Court's Section 230 hearings on Big Tech's responsibility for online harm, the EP's large-margin approval of online campaigning rules for EU members (and foreign stakeholders), and the CDT's advice on amending EU and US frameworks for researcher access to social media data.

If you would like to highlight your own or colleagues’ work through the TIP newsletter, let us know by emailing us here, or message us on Twitter!

See you next month,




Working Paper Series

Send us your ideas!

We are proposing to create a Working Paper Forum to bring TIP members together in one (mostly virtual) space, offering feedback and support within a kind and welcoming community.

With this in mind, we're asking members to come forward with any project or work in progress you might want feedback on. This could include, but is not limited to:

  • An early draft of a paper
  • A grant application
  • A presentation
  • Or anything else!

The idea is to provide a friendly forum in which to share ideas and gain feedback. You will be asked to give a short overview of your work and/or circulate a paper in advance for feedback and comment. If you have a piece of work you think would be suitable, please get in touch with your ideas!

We’re hoping to start these sessions before the summer, so feel free to send suggestions for new or early projects. Sessions will be for members only, so if you know anyone who might be interested in joining, make sure to point them towards our membership link (remember, it’s free!).

73rd PSA Annual Conference

Coming to Liverpool soon
3-5 April 2023

If you haven’t already seen it, the Conference programme is out, and we are delighted to share our four TIP panels, which will take place at the following times and locations. We hope to see many of you there!

Unlike other specialist groups, we will not be holding our AGM at PSA in Liverpool, but we are having an informal get-together at 11:30 on the Monday, so do check the programme for details of where to find us.


Monday, 3 April

1:30pm – Panel: AI & Data Regulation
Venue – Panel Room 22, Chaired by: Dr. Declan McDowell-Naylor

  • “The regulatory ecosystem of data driven campaigning: Evidence from four European countries” Dr. Andrew Barclay, Prof. Rachel Gibson, Prof. Kate Dommett
  • “Critical issues in Artificial Intelligence governance research” Dr. Inga Ulnicane
  • “The future of AI development, safety, and democracy – a question of forward-looking responsibility” Dr. Maria Hedlund
  • “Implementing AI in the Swedish public sector – a struggle on knowledge?” Prof. Malin Rönnblom, Dr. Andreas Öjehag, Dr. Vanja Carlsson

3:30pm – Panel: International Cyber Security
Venue – Panel Room 22, Chaired by: (TBC)

  • “Paving the path by exploiting the pervasiveness and vulnerabilities of cyberspace to go from digitalization to weaponization” Mr. Nawaf Alessa
  • “Between fragmentation and integration: the role of the UN in cybercrime rulemaking” Ms. Xing Fan
  • “Understanding successful cyber governance: The case of cyber crisis preparedness” Ms. Jieqiong Wu

Tuesday, 4 April

9:30am – Panel: New Digital Methods in Policymaking
Venue – Panel Room 22, Chaired by: Dr. Liam McLoughlin

  • “Analysing the public model of social participation designed by central government of Brazil: ‘Participa + Brasil’” Ms. Jurema Luzia Ribeiro Pereira
  • “Rural voices: future-proofing digital inequality policy initiatives” Mrs. Cate Hopkins
  • “Deconstructing diplomatic signals with natural language processing” Mr. Kyrre Berland, Dr. Martin Wählisch
  • “Employing a Bayesian spatial following model to estimate the ideological positions of UK MPs on Twitter” Mr. Conor Gaughan

1:30pm – Panel: Social Media and Politics
Venue – Panel Room 22, Chaired by: Dr. Liam McLoughlin

  • “How to identify political influencers online: by reference and reactions, or by hashtags?” Dr. Esmeralda Bon, Prof. Rachel Gibson, Dr. Fabienne Greffet, Mr. Peter Smyth
  • “Voice: Feminist grassroots mobilizations in Putin’s Russia” Ms. Natalia Kovyliaeva
  • “Moderating political communities: A study of moderation practices on political subreddits” Dr. Liam McLoughlin
  • “Facebook as a crisis communication tool for local governments and citizens’ engagement” Mrs. Anastasia Kani, Dr. Amalia Triantafillidou, Prof. Georgios Lappas

Latest News, Research, & Opportunities

Section 230 ruling: Big Tech and the Terrorism Act

Much of the tech policy world has been talking about last month’s US Supreme Court hearings on Section 230, which put Big Tech firms’ liability for online harm under the microscope. Section 230 is a 1996 provision of the Communications Decency Act that protects online platforms from legal responsibility for user-generated content. It has come under fire in recent years amid increasing attention to online harassment, hate speech and disinformation, but only recently has it come before the Supreme Court.

The two heavily mediatised court cases, Twitter v. Taamneh and Gonzalez v. Google, were filed by families of victims of ISIS terrorist attacks in 2015 and 2017, alleging harmful effects of online content. The question of accountability has been linked to Section 230 via simultaneous referral to the 2016 Justice Against Sponsors of Terrorism Act (JASTA), which permits any US national injured by an act of international terrorism to sue anyone who provides substantial assistance to those who commit the act. Unfortunately, the law is incredibly vague, and this has fuelled sensationalist hype around the Supreme Court’s interpretation.

As is often the case (particularly in US courts), the devil is in the detail. In the Twitter case, the lawsuit concerns not complicity in a specific attack but the platform’s alleged role in recruiting and fundraising. In the Google case, the lawsuit targets the company’s discretion in algorithmically recommending content, which in some instances promoted acts of terrorism. The Supreme Court has the sticky and consequential task of drawing the line where users’ responsibility for content ends and Big Tech’s accountability to its wider consumer base begins.

Thus far, Justices appear more split over platforms’ responsibility for terrorist recruiting and fundraising (Twitter) than over their role in content recommendation (Google). The rulings are not expected for another few months, but the discussion and final decision will influence how Big Tech content moderation is framed in the US. Although this is important and worth the attention it is receiving, it is unlikely to break the internet.

The debacle highlights the vacuum in US legislation on the harmful effects of online content. Lawmakers should be focusing on improving mechanisms for moderating content and preventing recommendation algorithms from spreading harm, rather than playing the blame game.

Further articles

European Parliament votes to transform online political campaigning

In anticipation of the 2024 European Parliament elections, the body has voted to harmonise campaigning rules across the EU, covering both offline and online advertising.

The law, presented by the European Commission in November 2021 and approved for negotiations by the Council in December 2022, proposes that political advertising be limited to the 60 days before an election and that targeting be allowed on the basis of only three pieces of information: language, location and whether the user is a first-time voter. It was written with platforms like Facebook and Google in mind, to prevent parties from sending contradictory personalised messages to different groups of voters. The EP proposed appending two additional clauses: first, banning non-EU groups from buying political ads shown in its member states, and second, requiring Big Tech to publicly disclose the ads displayed during elections.

As has been the case over the last couple of years, the EU is acting as a trendsetter, gradually but surely incorporating digital elements into its legislation. This response to growing concern over voter manipulation, whether by local actors or foreign detractors, has been welcomed by academics and practitioners. Although it will not halt attempts to sow distrust among electorates, it is a first attempt at striking a balance between protecting freedom of expression and addressing data-driven voter manipulation ahead of elections.

The bill is now with the Council, which has less than a year to discuss any amendments before the next EP elections. Whether or not it passes into law by then, the 2024 elections will prove an essential test for its reassessment.

Further articles

Risks in providing researchers social media data access

The Center for Democracy and Technology (CDT) report ‘Defending Data: Privacy Protection, Independent Researchers, and Access to Social Media Data in the US and EU’ delves into the legal protections for stored social media data, and into how granting independent researchers access may make it easier for law enforcement personnel to access the same data.

In Europe, Article 40 of the EU’s Digital Services Act plans to provide vetted researchers, including civil society organisations, with access to social media users’ data. While this is critical for understanding the political effects of online platforms, improving transparency and accountability in tech, and identifying human rights violations more efficiently, it could also provide a route for increased surveillance, particularly in countries with authoritarian-leaning governments.

In the report, the CDT advises policymakers to amend current frameworks with measures to mitigate this adverse effect, such as excluding law enforcement agencies or agents from researcher vetting, preventing researchers from sharing data with third parties, and requiring researchers to destroy data after a set period. These safeguards will be essential to keep law enforcement away from sensitive personal information without costing researchers access to crucial data.

Further articles

Other news, books and stories we are reading

Short form

  • After Susan Wojcicki stepped down from YouTube, Big Tech has no women CEOs Bloomberg
  • Xiaohongshu becomes an online oasis for trans people in China Rest of the World
  • OpenAI unveiled a tool that detects text produced by ChatGPT, but with only a 26% success rate MIT Tech Review
  • Issues with the Indian Government’s Attempt to Flag ‘Fake’ or ‘False’ Content (a post-Modi documentary affair) Tech Policy Press
  • A popular Filipino app, Lyka, that did not shy away from monetising its users’ content, was suspended by the Philippines’ central bank Rest of the World
  • The recent Nigerian election was rife with misinformation, particularly on Twitter Integrity Institute

Books & Articles

Jobs & Opportunities!


Call for Papers


  • EU/UK Policy Lead, Reddit (UK / remote)
  • Programme Specialist Recommendation on the Ethics of AI, UNESCO (France)
  • Trust & Safety Policy Lead, among other positions, Epic Games (US, several locations)
  • Research Director in Internet and Technology, Pew Research Center (Washington, US)
  • Senior Policy Manager, Partnership on AI (CA or Remote, US)


Want to share your own updates, research, jobs, events, publications, and more with the network? Simply get in contact with us, or tag us on Twitter @PSA_tip (you can follow us while you’re there too!)

You can join TIP and get our newsletters direct to your inbox by joining here.
