SonyAI

Research Services

About us

Sony AI is Sony's flagship AI organization whose mission is to "Unleash Human Imagination and Creativity with AI."

Website
https://www.ai.sony/
Industry
Research Services
Company size
51-200 employees
Headquarters
Tokyo
Type
Public Company
Founded
2020

Updates

  • Most AI datasets are built using scraped images pulled from the internet without consent. These images are often labeled through guesswork and lack demographic transparency. But what happens when we take a different approach? “A Fair Reflection” is our short film about FHIBE—a new benchmark dataset created by our AI Ethics Team with consent, self-reported demographic data, and a commitment to fairness in representation. Watch the film and explore how FHIBE is raising the bar for evaluation datasets. 👉 https://lnkd.in/g59tjgJj

  • The images we use to train AI shape how it sees the world, and how it sees us. But today’s benchmarks often fail to reflect the full diversity of human identity. FHIBE is a new kind of benchmark. Built from global, consented, self-reported data, it offers a new standard for evaluating fairness across computer vision tasks. 🎬 Watch “A Fair Reflection,” our short film exploring why this matters. 👉 Visit the FHIBE site to access the benchmark and watch the film: https://lnkd.in/g59tjgJj

  • AI is moving fast — but without ethics, it risks leaving people behind. That’s why Sony AI created the Fair Human-Centric Image Benchmark (FHIBE), the first consent-driven dataset designed to evaluate AI fairness in human-centric computer vision. Just published in Nature, FHIBE introduces a new way to benchmark AI models for ethics and fairness. It embeds consent, transparency, and global representation directly into the dataset itself — helping researchers uncover new kinds of bias and design systems that aim to serve everyone. In A Fair Reflection, our new short film about FHIBE’s creation, Michael Spranger, President of Sony AI, explains why this work matters for shaping the next generation of AI responsibly. Discover more at https://lnkd.in/g59tjgJj — watch the full film and explore the benchmark. #FHIBE #FairAI #AIFairness #EthicalAI #SonyAI

  • Bias in AI doesn’t just live in algorithms. It starts in the data we use. Just published in Nature, FHIBE — the Fair Human-Centric Image Benchmark — was built to challenge that foundation with consent, diversity, and fairness at its core. Our short film, A Fair Reflection, captures how FHIBE was created, why it matters, and how the AI community can build better. Watch the film now → https://lnkd.in/gv4bZF6H #FHIBE #AIEthics #FairAI #AFairReflection #SonyAI

  • Today marks a major milestone for Sony AI: we’re proud to unveil FHIBE (Fair Human-Centric Image Benchmark), the first globally diverse, consent-driven dataset designed to benchmark fairness in AI. For three years, our team worked across 81 countries, gathering more than 10,000 images from nearly 2,000 participants — with explicit, revocable consent and fair pay — to create a dataset rigorous enough to challenge today’s most advanced models. FHIBE is more than data. It’s a blueprint for how AI can be developed responsibly. With FHIBE, developers can measure bias in face detection, pose estimation, and vision-language models before deployment, so AI can better serve everyone. Our research is now published in Nature, and we hope FHIBE sparks the next phase of ethical AI — where fairness evaluation becomes standard practice. 📖 Dive into the full story, access FHIBE, and see why this dataset is a pivotal step for AI: https://bit.ly/3JKaqnm #FHIBE #AIethics #FairnessInAI #ResponsibleAI #SonyAI

  • AI can’t be fair if the data isn’t. So what if fairness started at the dataset? At Sony AI, we’ve spent years examining how bias enters datasets through scraped images, shortcut learning, and unchecked assumptions. We’ve published on mitigating bias without group labels. On measuring diversity instead of claiming it. On fairness as a lifecycle, not a checkbox. And now, we’re building something new, shaped by everything we’ve learned. Before there’s a model, there’s a dataset. That’s where ethical AI begins. 📖 Read the blog → https://bit.ly/4hKdnAP #SonyAI #EthicalAI #DatasetBias #AIresearch #FairnessInAI #ICML2024 #NeurIPS2024
