This paper examines ‘open’ artificial intelligence (AI). Claims about ‘open’ AI often lack precision, frequently eliding scrutiny of substantial industry concentration in large-scale AI development and deployment, and often incorrectly applying understandings of ‘open’ imported from free and open-source software to AI systems. At present, powerful actors are seeking to shape policy using claims that ‘open’ AI is either beneficial to innovation and democracy, on the one hand, or detrimental to safety, on the other. When policy is being shaped, definitions matter. To add clarity to this debate, we examine the basis for claims of openness in AI, and offer a material analysis of what AI is and what ‘openness’ in AI can and cannot provide: examining models, data, labour, frameworks, and computational power. We highlight three main affordances of ‘open’ AI, namely transparency, reusability, and extensibility, and we observe that maximally ‘open’ AI allows some forms of oversight and experimentation on top of existing models. However, we find that openness alone does not perturb the concentration of power in AI. Just as many traditional open-source software projects were co-opted in various ways by large technology companies, we show how rhetoric around ‘open’ AI is frequently wielded in ways that exacerbate rather than reduce concentration of power in the AI sector.
The 175-page report, “‘I Learned How to Say No’: Labor Abuses & Sexual Exploitation in Colombian Webcam Studios,” exposes working conditions in webcam studios in Bogotá, Cali, Medellín, and Palmira, where models record content that is broadcast by adult platforms and streamed around the world. Webcamming is a global industry; studies estimate that platforms keep between 50 and 65 percent of what viewers pay. People interviewed said that studios retain as much as 70 percent of what the platform pays out, further reducing workers’ earnings. Adult webcam platforms based in the United States and Europe should immediately address labor abuses and sexual exploitation in Colombian webcam studios.
A young Tibetan woman living in Northern India takes a trip back to her village in Tibet to visit family. For the last two years she has been working for Drewla, a Tibetan NGO that provides ways for Tibetans inside Tibet to connect with the diaspora online. The trip does not go as planned.
When she reaches the Nepalese-Tibetan border she is immediately taken into detention and held for two months. Chinese authorities interrogate her about her employment in Dharamsala. The young woman denies being involved in any political activities and insists she went to Dharamsala for studies. The authorities present a stack of chat transcripts from conversations she has had online. They explain that they have been monitoring Drewla and know about its activities. They eventually release the woman and allow her to travel to her village with a message for her colleagues back in Dharamsala: “You are not welcome to return to Tibet.”
The proliferation of surveillance technology in recent years has quietly transformed the landscape of personal privacy and security. Tools once reserved for law enforcement and intelligence agencies are now readily available to the general public, often marketed under the guise of child safety (parental control) or employee monitoring.
In a joint investigation with The First Department, The Citizen Lab uncovered spyware covertly implanted on the phone of a Russian programmer following his release from Russian custody. The Monokle-like spyware allows an operator to track the device’s location, record phone calls and keystrokes, and read messages from encrypted messaging apps.
MATRIX, a messaging service made by criminals for criminals, was first discovered by Dutch authorities on the phone of a criminal convicted of the 2021 murder of a Dutch journalist. A large-scale investigation into the messaging service was initiated. It soon became clear that the infrastructure of this platform was technically more complex than previous platforms such as Sky...
Forests are critical spaces that shape and enable gendered subjectivities in culturally and historically specific ways. However, scholarly work on forest or biodiversity conservation continues to take a perfunctory view of gender–environment relationships. Many projects remain gender blind or view everyday practices of forest resource collection by women through a transactional or economic lens. Research has shown that forests are spaces wherein the identities of women are entwined with their everyday activities in the forest. In this article, we demonstrate the gendered nature of the forests of the Corbett Tiger Reserve (CTR) in India, and their different socio-cultural framings. We reveal how the forest spaces of the CTR are used by women for a wide variety of cultural and livelihood needs. We further show how biodiversity conservation practice in such forest spaces alters the activities of women in myriad ways. The increasing use of digital technologies in biodiversity conservation shapes how forest space is observed and governed. We argue that the use of digital technologies for forest governance, such as camera traps and drones, tends to transform these forests into masculinized spaces that extend the patriarchal gaze of society into the forest. Finally, we reflect on how the use of digital technologies for biodiversity conservation is easily co-opted for purposes beyond conservation that reinforce patriarchal norms and propagate gendered structural violence.
Research from Consumer Reports on people-search removal services in the US
We analyze the system Amazon deploys on the US “amazon.com” storefront to restrict shipments of certain products to specific regions. We found 17,050 products that Amazon restricted from being shipped to at least one world region. While many of the shipping restrictions relate to regulations involving WiFi, car seats, and other heavily regulated product categories, the most common product category restricted by Amazon in our study was books.