Our projects

VIOGÉN SYSTEM

VioGén is an algorithm that determines the level of risk faced by a victim of gender-based violence in Spain and establishes her protection measures. It is the largest risk assessment system in the world, with more than 3 million registered cases. Since 2018, Eticas has reached out to the Spanish Ministry of the Interior several times to offer a confidential, pro bono internal audit of the VioGén system. While the suggestion was well received, no action was taken, so in 2021 Eticas decided to conduct an external audit of the system.

BAD DATA

If data can be bad, can decisions based on data be good? This project seeks to demystify some of the common assumptions about how data works, showing that bad data can have real-life consequences: some trivial, others life-changing and inescapable.

BIG DATA AT THE BORDER

Enhanced security measures have turned airports into dense data ecosystems, and human mobility into a continuously monitored process. Digitalising travel and border crossing is having impacts that go way beyond practicalities. Could data be changing the nature of borders, our relation to our biometric bodies and our definition of identity and belonging? 

SAFE ENVIRONMENTS

How is data used in schools? In 2017 and 2018 we explored how the monitoring of learning affects teaching and privacy, looking at how surveillance technologies have proliferated in education: from CCTV in bathrooms to EdTech.

UNIONS FACING THE TECHNOLOGICAL CHALLENGE

Digitization, automation and artificial intelligence are shaking up the world of work, pointing to a future where labour may look very different from today. In this changing environment, since 2019 we have teamed up with trade union organisations to think collectively about these challenges and to develop a technological agenda that incorporates privacy, digital labour rights and ethical standards into labour relations. To turn our research into practice, we have created an ever-evolving toolbox of resources that trade union representatives can use to raise their concerns about the use and impact of data-intensive technologies at work.

¿QUIÉN DEFIENDE TUS DATOS?

For the last few years (2018, 2020) we have teamed up with the Electronic Frontier Foundation in the US and other counterparts in Latin America to encourage Internet companies to side with their users in defending their privacy and promoting their digital rights. We have designed a series of evaluation criteria that serve as a guide to assess and compare how companies treat your data and protect your rights, and to rank them accordingly. These criteria are designed to go beyond what is strictly stipulated by law, as we seek to promote best practices in defense of the privacy rights of users and customers.

RESPONSIBLE OPEN DATA

Data is everywhere. In the digital age, our data footprints include information on what we do, where we go, who we know, what we have, what we like and how we feel. We generate this information while we work, walk, interact, speak, protest or search online, and the activities we engage in generate data of their own. All this information can be used to shape services, products and cities, for instance, and to promote transparency and accountability. We teamed up with the Open Data Institute in the UK to develop practical guidelines that help organisations anonymise the data they use, publish and share.

BARCELONA’S ICT ECOSYSTEM FROM A GENDER PERSPECTIVE

Women continue to be discriminated against in all areas and fields, but particularly in ICT. This study is a diagnosis of gender policies in ICT: we analyze the information and communication technology ecosystem in Barcelona from a gender perspective to identify the causal relationships that give rise to and perpetuate the digital gender gap in the city. We evaluate and review the current situation and data from different sectors, from labour to education, as well as cultural associations and public administration, to help policy-makers understand where the gaps and opportunities to address gender discrimination lie.

Latest Blog Posts

How to protect your privacy from everyday apps

As we revealed in our study ‘Mi cuerpo, mis datos, sus normas’, menstrual tracking apps, despite their innocent appearance, violate their users’ privacy. But many other apps we use every day know more about our lives than we might imagine, and may be selling this data to third parties. […]

Tips to understand how AI impacts young people

August 12, 2022. Generations Alpha and Z have been born with AI as one of the tools present in their daily lives. They could be called AI natives, as they have never known a world without it. But even if AI has many positive uses, such as personalized educational trajectories, there are also threats for these […]

NarxCare, an algorithm to predict the risk of narcotic, sedative and stimulant abuse

August 11, 2022. Opioid abuse has, over the last few decades, grown into a crisis. Different responses have been developed to handle it, among them NarxCare, a solution with a positive intention but a concerning implementation. Developed by Appriss and deployed in the United States, the algorithm combs through the multi-state Prescription Drug Monitoring Program (PDMP) database […]

The environmental cost of AI: a savior or a destroyer?

August 5, 2022. Artificial intelligence is often presented as the new solution to protect, among other things, the environment. In fact, AI has already been put to use for this purpose: IBM applies it in its forest fire detection system ‘Bee2FireDetection’, while Microsoft has supported projects such as ‘Wild me’, in which it is used to identify animal […]

