Trust and safety issues associated with OnlyFans, particularly the platform's advertising practices on social media and their potential impact on sex workers' safety and privacy, as well as the protection of minors (who may be unwittingly exposed to explicit material by clicking the link).
The role of technology in shaping the future of sex work and the decriminalization movement.
Several websites prohibit any external links in their community guidelines.
Elissa Redmiles is currently faculty at the Max Planck Institute for Software Systems in Germany, and her research focuses on understanding and modeling security, privacy, and safety behaviors using computational, economic, and social science methods. She is well known for her comprehensive research on security advice for users, as well as for recent research on at-risk populations (including sex workers), COVID-19 apps, algorithmic fairness, misinformation, and more. If you're interested, here are some links to talks she's given recently:
Explore the potential harms to women's reproductive health privacy resulting from the widespread collection and sharing of personal data through menstrual tracking apps and other femtech products.
[Enemy 2: Government] Government requests for geolocation data for criminal-prosecution purposes.
Examine the ways in which women's data privacy can be protected in a post-Roe v. Wade climate, with a particular focus on the potential for data collected by period-tracking apps to be used to investigate or prosecute women who have had abortions.
What would the reasonable-expectation-of-privacy test mean here?
Daniel v. Armslist, LLC, 926 N.W.2d 710, cert. denied, 140 S. Ct. 562 (2019).
Digital Services Act
“No condition is attached to obtaining the shield. Section 230 does not require any obligations such as a reasonable moderation effort or a duty to notify relevant authorities of unlawful content. On the other hand, the current version of the EU’s Digital Services Act imposes notice-and-action requirements on hosting services for unlawful content in exchange for liability immunity.”
Articles 4 to 10 of the Digital Services Act (DSA), which replace Articles 12 to 15 of the eCommerce Directive 2000/31/EC, outline the rules for intermediary liability privileges, which generally grant intermediary services immunity for the third-party content they process, so long as they act quickly upon receiving notice of illegal content. The DSA clarifies that this immunity is not lost if the intermediary service performs voluntary preemptive screening or monitoring of content, as long as it does so in good faith and with diligence. The DSA also states that intermediaries shall not be deemed to have lost their immunity solely because they take measures to comply with Union and national laws. See Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC.
Perhaps they are largely governed by national laws?
Daniel Parsons 👍
Topic: AI/Machine Learning: Biases in Employment Outcomes
The best literature review!
Very large online platforms and very large online search engines
1. This Section shall apply to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms or very large online search engines pursuant to paragraph 4.
Topic: The CRISPR Conundrum: Social, Bioethical, and Legal Implications of Human Genome Editing
Materialized Offline Harm from Data Breaches
Both topics are so interesting! 👍
(1) Topic 1
Potential recourse for victims of platforms' recommendations of harmful content — exploring various legal claims (such as product liability, criminal law, campaign disclosure, sexual harassment, etc.) to determine their viability.
It can be challenging to connect the dots between disparate legal fields, but doing so can yield novel and interesting insights.
(2) Topic 2: The social media feed as a matter of self-determination.
From the perspective of the right to self-determination and personality rights, this topic examines the problem of the social media feed being fully controlled by social media companies. Section 230 was enacted to allow user-generated content to circulate freely on the Internet, but in an era of content overload, the key authority lies in determining the ranking of content, a power the platforms monopolize. Is there no way to address the loss of user agency caused by this private power?
This topic takes a very philosophical approach, and I love it; I think U.S. readers should be exposed to this type of reasoning more often. I really like this topic! If you can't find much support in the current literature or case law, I think it's entirely fine to make a high-level reasoned argument.
General suggestion: To provide a comprehensive analysis of this issue, I recommend that you take a deep dive into the literature and examine how other scholars have framed similar questions and formulated methodologies. For this class assignment, I suggest that you focus on reading 3-4 papers that can help you develop a solid framework for your research. Look for papers that discuss similar issues and examine how they have addressed the problem. After reading these papers, consider which framework you were most persuaded by and use it as a starting point for your research.
Although conceding that the statute impeded the flow of ideas, the Court held that this impediment was subordinate to the right of people in their homes to be free from unwanted material.
The Digital Services Act
“The DSA builds on the comprehensive protection that GDPR already offers to European citizens and adds a new layer to it. In the future, online marketers will therefore have to take into account not only the GDPR and ePrivacy rules (cookies and anti-spam), but also the new restrictions imposed by the DSA, at least if their activity falls within the scope of the DSA (as already indicated, this concerns intermediary services such as social media, marketplaces and search engines).”
Article 35 Mitigation of risks 1. Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable: (e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide.
Article 39 Additional online advertising transparency 1. Providers of very large online platforms or of very large online search engines that present advertisements on their online interfaces shall compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.
Article 40 Data access and scrutiny. 4. Upon a reasoned request from the Digital Services Coordinator of establishment, providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.
Article 41 Compliance functions. 1. Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.
3. Compliance officers shall have the following tasks: (b) ensuring that all risks referred to in Article 34 are identified and properly reported on and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35.
Roman Hill / Ramita Bains
Topic: How does AI’s content moderation affect children?
United States v. American Library Association, Inc., 539 U.S. 194 (2003): The Supreme Court held that Congress has the authority to require public schools and libraries receiving E-Rate discounts to install web-filtering software as a condition of receiving federal funding.
This paper was relevant to your team's point. It traces the history of search and presents the authors' skeptical view of combining search engines with large language models, advocating "the need for flexible tools that can support diverse modes of usage."