Introducing LORA - Trustworthy AI-Adoption for Kids
![Founders Dima Rubanov and Matthias Neumayer](/static/ed3bcf8af60d6896b29b03b15e2bf861/5d536/scientist_f.png)
At LORA, we recognize that gender bias in AI systems is a critical issue that needs to be addressed head-on. Many AI models reflect and even amplify real-world biases, including outdated gender stereotypes. For example, AI language models may associate doctors and engineers with men, while assigning roles like nurse and teacher to women. This bias stems from the human-generated data used to train AI. However, we believe AI also presents an opportunity to move beyond human biases and shape a more equitable future. That's why, at LORA, we are developing trustworthy, ethical AI for children.
Meet the Team
Dima Rubanov, Matthias Neumayer, Marco Marthe
Supported by
AWS AI Adoption / The Austrian Federal Ministry for Labour and Economy (BMAW) / Fachakademie Sozialpädagogik München Mitte