Google and Microsoft Execs On AI Literacy and Responsible Use

“The value of tech is not created by its inventors; it’s unlocked by its users,” she said.

She also encouraged marketers to share ideas with one another to gain insight into how other industries are using AI.

“We are all on an even playing field,” she said. “Everybody’s learning and we’ve got to figure out how to share together.”

Buddy Phillips is senior director of sales enablement and responsible AI lead at Microsoft.

Responsible AI

Microsoft’s Phillips said that when it comes to AI usage, trust has to be earned. 

According to a study by the University of Queensland and KPMG, 73% of people are concerned about the risks of AI, while 75% would trust AI if companies can prove they are using the technology responsibly.

Phillips encouraged companies to put responsible AI frameworks into practice to earn the trust of customers and clients. He pointed out that, per PwC, only 11% of executives report having implemented fundamental responsible AI capabilities in an effort to build that trust.

To implement AI responsibly, Phillips said, companies must consider the purpose of their AI use, its technical capabilities and liabilities, and any sensitive applications. “Remember: sociotechnical, human-centered approach,” he said.

He added that Microsoft’s own AI principles are supported by “transparency and accountability.”

Last year, the company announced its Copilot Copyright Commitment: if a customer is sued for copyright infringement for using its Copilot services, Microsoft will defend them and pay the amount of any adverse judgments or settlements that arise from the lawsuit.

“We want to make that our issue, and not theirs,” Phillips said.
