Generating new viewpoints from a single photo has long been a challenge. GenWarp tackles this with a new framework for synthesizing novel 3D views from just one image. 👉 Read the full research paper to see how it works: https://bit.ly/4rAne0O
SonyAI
Research Services
About us
Sony AI is Sony's flagship AI organization whose mission is to "Unleash Human Imagination and Creativity with AI".
- Website
- https://www.ai.sony/
- Industry
- Research Services
- Company size
- 51-200 employees
- Headquarters
- Tokyo
- Type
- Public Company
- Founded
- 2020
Locations
-
Primary
1-7-1 Konan
Minato-ku
Tokyo 108-0075, JP
Updates
-
SonyAI reposted this
Introducing the Gran Turismo Power Pack, bringing hardcore contests like 24-hour races to the game. Available for purchase later this year, along with the free Spec III update: https://bit.ly/47FmG1C
-
Alice Xiang, Global Head of AI Governance at Sony Group Corporation and Lead Research Scientist at Sony AI, explains how FHIBE set a new bar for dataset evaluation: global diversity, GDPR-style consent, and scientific rigor. Learn more at https://lnkd.in/g59tjgJj, where you can watch A Fair Reflection and access the benchmark. #FHIBE #FairAI #AIFairness #EthicalAI #SonyAI
-
Most AI datasets are built from images scraped from the internet without consent. These images are often labeled through guesswork and lack demographic transparency. But what happens when we take a different approach? “A Fair Reflection” is our short film about FHIBE, a new benchmark dataset created by our AI Ethics Team with consent, self-reported demographic data, and a commitment to fairness in representation. Watch the film and explore how FHIBE is raising the bar for evaluation datasets. 👉 https://lnkd.in/g59tjgJj
-
The images we use to train AI shape how it sees the world, and how it sees us. But today’s benchmarks often fail to reflect the full diversity of human identity. FHIBE is a new kind of benchmark. Built from global, consented, self-reported data, it offers a new standard for evaluating fairness across computer vision tasks. 🎬 Watch “A Fair Reflection,” our short film exploring why this matters. 👉 Visit the FHIBE site to access the benchmark and watch the film: https://lnkd.in/g59tjgJj
-
Behind every dataset are decisions about people, consent, and representation. FHIBE—the Fair Human-Centric Image Benchmark—was three years in the making. Built by Sony AI, it rethinks how datasets can be created responsibly, with consent, diversity, and transparency at the core. Discover the insights from the AI Ethics team that built FHIBE: https://bit.ly/3LObF5r #SonyAI #FHIBE #EthicalAI #ResponsibleAI #AIEthics
-
AI is moving fast, but without ethics it risks leaving people behind. That’s why Sony AI created the Fair Human-Centric Image Benchmark (FHIBE), the first consent-driven dataset designed to evaluate AI fairness in human-centric computer vision. Just published in Nature, FHIBE introduces a new way to benchmark AI models for ethics and fairness. It embeds consent, transparency, and global representation directly into the dataset itself, helping researchers uncover new kinds of bias and design systems that aim to serve everyone. In A Fair Reflection, our new short film about FHIBE’s creation, Michael Spranger, President of Sony AI, explains why this work matters for shaping the next generation of AI responsibly. Discover more at https://lnkd.in/g59tjgJj, where you can watch the full film and explore the benchmark. #FHIBE #FairAI #AIFairness #EthicalAI #SonyAI
-
Bias in AI doesn’t just live in algorithms. It starts in the data we use. Just published in Nature, FHIBE (the Fair Human-Centric Image Benchmark) was built to challenge that foundation with consent, diversity, and fairness at its core. Our short film, A Fair Reflection, captures how FHIBE was created, why it matters, and how the AI community can build better. Watch the film now → https://lnkd.in/gv4bZF6H #FHIBE #AIEthics #FairAI #AFairReflection #SonyAI
-
Today marks a major milestone for Sony AI: we’re proud to unveil FHIBE (Fair Human-Centric Image Benchmark), the first globally diverse, consent-driven dataset designed to benchmark fairness in AI. For three years, our team worked across 81 countries, gathering more than 10,000 images from nearly 2,000 participants, with explicit, revocable consent and fair pay, to create a dataset rigorous enough to challenge today’s most advanced models. FHIBE is more than data. It’s a blueprint for how AI can be developed responsibly. With FHIBE, developers can measure bias in face detection, pose estimation, and vision-language models before deployment, so AI can better serve everyone. Our research is now published in Nature, and we hope FHIBE sparks the next phase of ethical AI, where fairness evaluation becomes standard practice. 📖 Dive into the full story, access FHIBE, and see why this dataset is a pivotal step for AI: https://bit.ly/3JKaqnm #FHIBE #AIethics #FairnessInAI #ResponsibleAI #SonyAI
-
AI can’t be fair if the data isn’t. So what if fairness started at the dataset? At Sony AI, we’ve spent years examining how bias enters datasets through scraped images, shortcut learning, unchecked assumptions. We’ve published on mitigating bias without group labels. On measuring diversity instead of claiming it. On fairness as a lifecycle, not a checkbox. And now, we’re building something new shaped by everything we’ve learned. Before there’s a model, there’s a dataset. That’s where ethical AI begins. 📖 Read the blog → https://bit.ly/4hKdnAP #SonyAI #EthicalAI #DatasetBias #AIresearch #FairnessInAI #ICML2024 #NeurIPS2024