Azure OpenAI services: potential business use cases

Not so long ago, news broke of the upcoming broad cooperation between Microsoft and OpenAI, which is to include Microsoft’s search engine Bing, MS Teams, and Office. Unsurprisingly, since OpenAI already runs on Azure, an Azure OpenAI service is also in the works.

The news of Microsoft’s continued commitment to integrating AI capabilities into its services had hardly died down when OpenAI announced the latest version of its flagship product: GPT-4. The version you are probably most familiar with is GPT-3.5, which is freely available on the OpenAI website. GPT-4 offers greater capabilities than its predecessor: it is 82% less likely to respond to requests for disallowed content, 40% more likely to produce factual responses, and also scores much higher on various tests.

Of these improvements, the higher percentage of factual responses bodes well in the context of the announced incorporation of AI in Bing. However, the most impressive feature of the successor to GPT-3.5 is its ability to recognize images.

The new model is not available to casual users (they still have to make do with GPT-3.5), although it is available to ChatGPT Plus subscribers and, more importantly for developers, there is an API waiting list.

Speaking of developers, let us get back to the topic of Azure–OpenAI cooperation.

 

The potential benefits of an Azure OpenAI service

When asked why Microsoft has not simply acquired OpenAI (considering Microsoft’s continued commitment to using OpenAI in its services), the technology giant’s CEO, Satya Nadella, answered that it is all running on Microsoft Azure anyway. In fact, a number of the most advanced AI models are hosted on Microsoft Azure. Incorporating OpenAI’s product into the already hugely valuable and popular MS Azure cloud service could be a real game changer, especially since other cloud service providers do not offer a similar capability.

There are also many benefits of a joint Azure–OpenAI service for a company looking to implement AI models in its operations, starting with the basic benefits of using Azure, not only as an AI-optimized infrastructure but for any business purpose—be it creating cutting-edge applications or storing company data.

First, MS Azure offers trusted enterprise-grade capabilities in terms of data security and compliance with legal regulations, such as the GDPR framework in the EU. Second, it is scalable—since Azure is a cloud service, additional resources can be easily obtained if an application needs them. This also makes Azure cost-effective, as the amount of resources used (and thus paid for) can be adjusted to current needs. This is one of the reasons why ever more businesses are using cloud services.

Beyond these general advantages, an Azure OpenAI service would also have specific benefits for those who want to apply AI models in their operations. Training AI models requires vast amounts of data and computing power, which MS Azure can provide. As mentioned before, it was on Azure that OpenAI trained ChatGPT.

Finally, if a company is already using Azure, OpenAI services can be added on top of existing infrastructure and managed by the company’s own administrators. There would be no need to build dedicated infrastructure or worry about integrating new third-party tools, making sure they are compatible with the existing system and providing adequate security.

 

Azure OpenAI—potential use cases

An AI-optimized infrastructure with trusted enterprise-grade capabilities, such as the Azure OpenAI service, opens the way to democratizing AI models for entrepreneurs. From cutting-edge applications to a fine-tuned version of existing language models geared specifically to your company, there will be countless options to use the Azure OpenAI service in business. Here are just a few examples.

 

Natural language summaries

One useful way to apply AI language models to corporate needs is to create a tool to summarize documents. Every big company produces vast amounts of documentation and employees do not necessarily have the time to read all of it. Having an AI language model summarize these documents in concise natural language would save a lot of time and improve efficiency. A similar model can be used to summarize and compile product reviews from customers.
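To make this more concrete, below is a minimal sketch in Python of what such a summarization call against an Azure OpenAI deployment could look like. The endpoint, key, deployment name, and file name are placeholders, and the Azure flavor of the openai SDK (version 1.x) is assumed; a production setup would add chunking for long documents and proper error handling.

```python
import os

from openai import AzureOpenAI  # openai>=1.0 exposes an Azure-specific client

# Placeholder configuration: the endpoint, key, and API version depend on your Azure setup.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

def summarize(document: str, max_words: int = 150) -> str:
    """Ask a chat model deployment for a concise natural-language summary."""
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # the name of your own deployment, not a fixed value
        messages=[
            {"role": "system",
             "content": f"Summarize the user's document in at most {max_words} words."},
            {"role": "user", "content": document},
        ],
        temperature=0.2,  # keep the summary close to the source text
    )
    return response.choices[0].message.content

# Hypothetical usage with a local file:
print(summarize(open("quarterly_report.txt", encoding="utf-8").read()))
```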

Such models already exist. The American used car retailer CarMax is using the Azure OpenAI Service to create review summaries that help customers choose the best car. The AI-generated content appears on CarMax’s car research website and is reviewed by human staff members before publication. The approval rate for this content has been 80%, i.e., 4 out of 5 AI-generated texts are approved for use by the employees. Most importantly, creating this much content manually would have taken (according to CarMax’s estimates) 11 years.

 

Reading your CV

A similar solution would be to analyze and summarize résumés and cover letters for HR departments. It could be based either on an existing advanced AI model or on a specially fine-tuned version of one. It might involve analyzing résumés and cover letters for keywords related to the requirements of a given position, identifying candidates who fit those requirements, and summarizing their applications accordingly. The benefits for an HR department and the entire company include faster analysis of incoming CVs and cover letters, a more efficient recruitment process, time and cost savings, and greater HR team productivity.
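A hypothetical screening helper along these lines might look as follows. The deployment name and prompt wording are assumptions, and a production version would of course need anonymization, bias checks, and privacy safeguards that are out of scope for this sketch.

```python
import os

from openai import AzureOpenAI

# Placeholder Azure OpenAI configuration, as in the earlier sketch.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

def screen_resume(resume_text: str, job_requirements: str) -> str:
    """Return a short, structured assessment of how a résumé matches a job posting."""
    prompt = (
        "Job requirements:\n" + job_requirements
        + "\n\nCandidate résumé:\n" + resume_text
        + "\n\nList the requirements the candidate meets, the ones they do not, "
          "and finish with a three-sentence summary of the candidate."
    )
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # your own deployment name
        messages=[
            {"role": "system", "content": "You are an assistant helping HR screen résumés."},
            {"role": "user", "content": prompt},
        ],
        temperature=0.0,  # deterministic output makes screening results easier to compare
    )
    return response.choices[0].message.content
```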

 

Language models talking to customers

A fairly obvious solution is to create a fine-tuned version of one of the language models to function as a company-specific chatbot that talks to customers. It could be based on an existing general language model, such as GPT-3.5, or on a language model trained for specific tasks. It could even be trained on company data—such fine-tuning would allow it to speak in the “company voice.” Examples of uses for such a fine-tuned chatbot include customer support: it could answer customers’ questions about products and services, perhaps even solving certain common problems.

Such models would allow customers to contact the company 24/7 without the company having to hire people to work helplines around the clock. In particular, standard and common questions could be answered by natural language models, though human consultants would still be needed to solve larger problems. The human customer support workforce could thus be limited to highly trained specialists with broad access to information and permissions allowing them to actually solve major problems rather than bump them up the chain of command. This would result in savings on expenses for the company and time for the customers, as well as more convenience and a better user experience.

One problem that might require some fine-tuning of current language models (such as GPT-3.5) is that they are generally too keen to respond to questions they do not have answers for. Some additional logic would be required to make the chatbot refer issues it cannot handle to human consultants.
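One simple (and admittedly naive) way to sketch such a handoff is to instruct the model to emit a sentinel token whenever it is unsure and to route those conversations to a human queue. The company name, sentinel token, and deployment name below are invented purely for illustration.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

HANDOFF_TOKEN = "[ESCALATE]"  # sentinel the model is told to emit when unsure

SYSTEM_PROMPT = (
    "You answer questions about ACME products only. "  # ACME is a stand-in company name
    f"If you are not sure of the answer, reply with exactly {HANDOFF_TOKEN}."
)

def answer_or_escalate(question: str) -> str:
    """Answer a customer question, or signal that a human consultant should take over."""
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # your own deployment name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0.0,
    )
    answer = response.choices[0].message.content
    if HANDOFF_TOKEN in answer:
        # In a real system this branch would create a ticket for a human consultant.
        return "Let me connect you with one of our consultants."
    return answer
```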

 

Internal company search engine

Recently, the news about advanced AI models that made the biggest splash in the media was Microsoft’s continued collaboration with OpenAI, including the application of an AI language model in the Bing search engine. It definitely seems to have spooked Google, which then came out with a similar announcement. However, search is not only needed on the open waters of the Internet. Another viable option is a search engine for internal company resources.

Every company produces and uses a variety of different documents, wikis, knowledge bases, procedures, etc. It can be difficult for employees to navigate through all of this information. An advanced search engine could speed up their work by allowing them to easily find the information they need by using a natural language query, such as “Find me the procedure for…” or “What does our standard contract for work look like?” The answers could come in the form of AI-generated summaries as well as links to relevant documents. What the employee needs might vary depending on what the information is to be used for. Obviously, such solutions would require these systems to be customized for the needs of a specific company, i.e., the training data for the model should be composed primarily of the company’s own material.
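One common pattern for this kind of internal search is to embed the company documents, retrieve the ones most relevant to a query, and let a chat model answer using only that material. The sketch below assumes an embeddings deployment and a chat deployment on Azure OpenAI, and it keeps the document store in memory purely for brevity; a real system would use a proper vector index over the company’s wikis, procedures, and templates, and the sample documents here are made up.

```python
import os

import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

def embed(text: str) -> np.ndarray:
    """Turn a text into an embedding vector using an Azure OpenAI embeddings deployment."""
    result = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return np.array(result.data[0].embedding)

# Toy in-memory knowledge base with made-up content; in practice, a vector index over real documents.
documents = {
    "vacation-policy.md": "Employees are entitled to 26 days of paid leave per year...",
    "contract-template.docx": "Our standard contract for work includes the following clauses...",
}
doc_vectors = {name: embed(text) for name, text in documents.items()}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query: str, top_k: int = 1) -> str:
    """Find the most relevant documents and answer the query based on them alone."""
    q = embed(query)
    ranked = sorted(doc_vectors, key=lambda name: cosine(q, doc_vectors[name]), reverse=True)
    context = "\n\n".join(f"{name}:\n{documents[name]}" for name in ranked[:top_k])
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # your own chat deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the provided company documents and name your sources."},
            {"role": "user", "content": f"Documents:\n{context}\n\nQuestion: {query}"},
        ],
        temperature=0.0,
    )
    return response.choices[0].message.content

print(search("What does our standard contract for work look like?"))
```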

 

Answering your mail

One of the most typical use cases for AI language models is producing texts in natural language. These might include reports, emails, newsletters, or messages to customers. However, AI capabilities are not limited to these tasks; there are more interesting things AI models can do with your emails. The average office worker spends over two hours a day responding to emails. Some of this work could be outsourced to AI models.

Such a solution would involve the engine reading all incoming emails, putting them into categories based on their content, perhaps prioritizing them so that you know which messages to respond to first, and even generating responses. A sales department might thus prioritize messages that are potential sales opportunities, HR would see candidate CVs separately from messages from employees, while customer support might be told which complaints require the most prompt response. A pre-generated response to most emails could be provided, to be sent after a quick review by an employee.
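A rough sketch of such an email triage step might look like this. The category list, priority levels, and output format are invented for illustration, and a production system would parse the model’s output far more defensively (or use a structured output format).

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

# Example categories only; each company would define its own.
CATEGORIES = ["sales opportunity", "job application", "customer complaint", "internal", "other"]

def triage_email(subject: str, body: str) -> dict:
    """Classify an email, assign a priority, and draft a reply for human review."""
    prompt = (
        f"Categories: {', '.join(CATEGORIES)}.\n"
        "For the email below, return exactly three lines:\n"
        "category: <one of the categories>\n"
        "priority: <high, medium or low>\n"
        "draft: <a short, polite draft reply>\n\n"
        f"Subject: {subject}\n\n{body}"
    )
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # your own deployment name
        messages=[{"role": "user", "content": prompt}],
        temperature=0.3,
    )
    # Naive parsing of the "key: value" lines returned by the model.
    lines = response.choices[0].message.content.splitlines()
    return {line.split(":", 1)[0].strip(): line.split(":", 1)[1].strip()
            for line in lines if ":" in line}
```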

There are significant benefits to basing such a solution on the Azure OpenAI service, especially if the company is already using MS Azure to host its email. Obviously, having access to all the data and processing in one place would speed up the whole process, but most importantly it would ensure security and confidentiality, as no data is sent outside of the system to a third-party application.

 

MS Power Platform

One of the potential uses of an AI language model is to convert natural language into code. We already know this feature from GitHub Copilot. At its Build developer conference, Microsoft unveiled similar technology that will be integrated into the Microsoft Power Platform. The aim is to allow users to build apps without knowing how to write computer code. Microsoft boasts that this will help citizen developers, as well as professionals, to build apps that improve business productivity or processes. There are already many business use cases for Microsoft’s low-code platform; a natural language interface would make it even more accessible.

The announced features, powered by GPT-3 (the predecessor to the current GPT-4 and the freely available GPT-3.5), allow a person to describe a programming goal in conversational language, which is then converted into code by the AI engine. Crucially, to avoid misunderstandings and errors, the AI model offers a number of choices for transforming the command into a Microsoft Power Fx formula. Microsoft stresses that at every stage there is a human in control and that the features are designed not to replace developers but rather to assist people who are learning the Power Fx programming language. Since the Power Platform is already a low-code platform and Power Fx is modeled on Microsoft Excel formulas (and thus easier to use than traditional coding languages), it might be the perfect environment for this “assisted learning” model.
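Microsoft has not published the internals of this feature, but the general pattern (a natural-language goal goes in, several candidate formulas come out, and a human picks one) can be sketched with an ordinary Azure OpenAI chat call. The deployment name and the sample request below are purely illustrative and are not how the actual Power Platform integration works under the hood.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",
)

def suggest_formulas(goal: str, n_options: int = 3) -> str:
    """Ask the model for several candidate Power Fx formulas so a human can pick one."""
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # your own deployment name
        messages=[
            {"role": "system",
             "content": (f"Propose {n_options} alternative Microsoft Power Fx formulas "
                         "for the user's goal, one per line, with no explanation.")},
            {"role": "user", "content": goal},
        ],
        temperature=0.7,  # some variety between the candidate formulas
    )
    return response.choices[0].message.content

print(suggest_formulas("show only the customers whose invoices are overdue"))
```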

 

It is not all plain sailing

It has to be noted that Microsoft imposes certain restrictions on the use of the Azure OpenAI service. For instance, chatbots and conversational engines must be limited to scoped questions (i.e., questions that have clearly defined boundaries or limits), which means that the AI model is only able to answer questions or perform tasks that fall within a specific domain or area of expertise. It is designed to recognize patterns and provide responses based on its training data, but it does not have the ability to understand the broader context or meaning behind the questions being asked. Therefore, the questions the AI model answers and the tasks it performs must be limited to those that fall within its defined scope.

Microsoft also restricts the use of the Azure OpenAI service for journalistic purposes. Users are not allowed to apply AI as a general content creation tool for any topic. It may also not be used to generate content for political campaigns. Similarly, writing assistance tools may only be used to rewrite or create content for specific business purposes or on predefined topics and not as general content creation tools.

In both chatbots and text generation, general-purpose application runs the risk of creating harmful or malicious content, which is why the scope of application must be limited and content filters are recommended. Finally, if you want to apply AI to search, it is necessary to make sure that the resulting responses are all grounded in trusted source documentation.

To sum up, Microsoft warns that while Azure OpenAI offers a great many opportunities for a variety of applications, there are some important considerations to keep in mind. First, the service is not suitable for open-ended, unconstrained content generation or scenarios where general availability of up-to-date, factually accurate information is crucial, unless human reviewers are employed to verify the content and/or the model’s suitability for a given scenario.

Second, scenarios that could result in physical or psychological injury or have a consequential impact on life opportunities or legal status should be avoided. Similarly, high-stakes scenarios should be carefully considered, especially in domains such as healthcare, medicine, finance, or legal.

Chatbot scenarios should also be well-scoped and limited to a narrow domain to reduce the risk of generating unintended or undesirable responses. Last, all generative use cases require careful consideration and mitigations since they may be more likely to produce unintended outputs.

 

Conclusion

Microsoft Azure is an essential platform for many businesses and has been the basis for Microsoft’s continued commitment to cooperation with OpenAI, which uses the platform to host its product. A Microsoft Azure OpenAI service is thus hardly a surprise.

Companies looking to implement AI models in their operations could benefit from a joint Azure–OpenAI service due to Azure’s enterprise-grade capabilities, scalability, and cost-effectiveness. An Azure OpenAI service could democratize AI models for entrepreneurs and would be a great addition to businesses that are already using Azure.

Potential use cases for an Azure OpenAI service include customer support chatbots, summarizing documents, analyzing résumés, and creating an internal company search engine. Microsoft’s low-code offering—the Power Platform and the Power Fx programming language—already creates vast possibilities for low-code developers. Adding AI-supported code writing may be a true force multiplier for an already powerful system.
