Microsoft is making the ChatGPT large language model available in preview as a component of its Azure OpenAI Service, paving the way for developers to integrate the model into a host of enterprise development and end-user applications.
Microsoft appears to have had a number of customers working with this integration already, listing ODP Corporation (the parent company of Office Depot and OfficeMax), Singapore’s Smart Nation Office of Digital Government, and contract management software provider Icertis as reference customers.
Developers using Azure OpenAI Service can use ChatGPT to add a variety of features to applications, such as recapping call center conversations, automating claims processing, and even creating new ads with custom content, among other things.
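As a rough illustration of what such an integration looks like, the sketch below builds a chat-completions request that asks the model to recap a call center conversation. The endpoint, deployment name, and API version are hypothetical placeholders, not details from the article; this is a minimal sketch of the pattern, not a definitive implementation.

```python
import json
from urllib import request

# Hypothetical values -- substitute your own Azure OpenAI resource details.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"  # assumed resource URL
DEPLOYMENT = "my-chatgpt-deployment"                     # assumed deployment name
API_VERSION = "2023-03-15-preview"                       # assumed preview API version


def build_recap_payload(transcript: str) -> dict:
    """Build a chat-completions payload asking the model to summarize a call."""
    return {
        "messages": [
            {
                "role": "system",
                "content": "You summarize call center conversations in 2-3 sentences.",
            },
            {"role": "user", "content": transcript},
        ],
        "temperature": 0.2,  # low temperature for consistent summaries
        "max_tokens": 150,
    }


def recap_call(transcript: str, api_key: str) -> str:
    """Send the transcript to the (assumed) Azure chat-completions endpoint."""
    url = (
        f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    req = request.Request(
        url,
        data=json.dumps(build_recap_payload(transcript)).encode(),
        headers={"Content-Type": "application/json", "api-key": api_key},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same payload shape, with a different system prompt, would cover the other use cases mentioned above, such as drafting ad copy or extracting claim details.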
Generative AI like ChatGPT is already being used to improve Microsoft’s product offerings as well. For example, according to the company, the premium version of Teams can use AI to create chapters in conversations and auto-generated summaries, while Viva Sales can offer data-driven targeting and suggest email content to help teams reach their customers.
Business use cases for Azure OpenAI ChatGPT
Ritu Jyoti, IDC group vice president for global AI and automation research, said the proposed use cases make a lot of sense and that she expects much of the initial use of Microsoft’s new ChatGPT-powered offering to be focused on internal applications within companies.
“For [example], help HR to develop job descriptions, help employees with management and discovery of internal knowledge; in other words, grow employees with internal search,” she said.
Pricing for the service works by tokens: one token covers roughly four characters’ worth of a given query in written English, with an average paragraph coming to about 100 tokens and a 1,500-word essay around 2,048. According to Jyoti, one of the reasons why GPT-3-based applications became more popular right before ChatGPT went viral is that OpenAI’s foundation-model price dropped to around $0.02 per 1,000 tokens.
ChatGPT via Azure costs even less, at $0.002 per 1,000 tokens, making the service potentially cheaper than using an internal large language model, she added.
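The pricing arithmetic above can be sketched as a small estimator. The four-characters-per-token heuristic and the $0.002-per-1,000-tokens rate come from the figures cited here; the helper function names are illustrative.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic from the article: one token covers about
    # four characters of written English.
    return max(1, len(text) // 4)


def estimate_cost_usd(num_tokens: int, price_per_1k: float = 0.002) -> float:
    # $0.002 per 1,000 tokens is the Azure ChatGPT rate cited above;
    # pass price_per_1k=0.02 for the earlier GPT-3 foundation rate.
    return num_tokens / 1000 * price_per_1k
```

At these rates, the 1,500-word essay the article mentions (around 2,048 tokens) would cost roughly $0.004 through Azure, versus about $0.04 at the earlier GPT-3 price.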
“I think the price is great,” Jyoti said.
Microsoft appears to be operating the service with an emphasis on responsible AI practices, according to Gartner vice president and distinguished analyst Bern Elliot, perhaps having learned lessons from incidents in which a chatbot front-end for Bing displayed odd behavior, including a conversation with a New York Times reporter in which the chatbot declared its love for him and tried to convince him to leave his spouse.
“I think Microsoft, historically, has taken responsible AI very seriously, and that’s to their credit,” he said. “Having a strong track record of ethical use and providing enterprise-grade privacy is positive, so I think that’s in their favor.”
That’s a key consideration, he said, given the concerns raised by the use of AI in the enterprise: data protection and contextualization of data sets, more specifically. The latter issue typically centers around making sure enterprise AI gets answers from the correct information base, ensuring those answers are correct, and eliminating the “hallucinations” seen in more general-purpose AI.
Copyright © 2023 IDG Communications, Inc.