Fairness in AI panel discussion | NeurIPS 2020 | Amazon Science

Fairness in AI is an important topic in machine learning and has become table stakes for ML platforms and services, driven by customer and business needs, regulatory and legal requirements, and societal expectations. Researchers are actively studying how to address disparate treatment caused by bias in the data, how to prevent ML models from amplifying that bias, and how to ensure that a learned model does not treat subgroups in the population unfairly.

During NeurIPS 2020, five Amazon scientists working on these challenges gathered for a 45-minute virtual session on the topic. In the recorded panel discussion, the scientists discuss how fairness applies to their areas of AI/ML research, the interesting studies and advances happening in the space, and the collaborations they are most excited to see across the industry to advance fairness in AI.

Learn more: https://www.amazon.science/videos-webinars/amazon-panel-to-host-virtual-event-on-fairness-in-ai

Follow us:
Twitter: https://twitter.com/AmazonScience
Facebook: https://www.facebook.com/AmazonScience
Instagram: https://www.instagram.com/AmazonScience
LinkedIn: https://www.linkedin.com/showcase/AmazonScience
Newsletter: https://www.amazon.science/newsletter